
Exploratory Automated Testing

Belgium Testing Days, February 27, 2013

Tutorial Part 1

Douglas Hoffman, BACS, MBA, MSEE,

ASQ-CSQE, ASQ-CMQ/OE, ASQ Fellow

Software Quality Methods, LLC. (SQM)

www.SoftwareQualityMethods.com

[email protected]

Copyright © 2004-13, SQM, LLC.

About Doug Hoffman

I am a management consultant in testing/QA strategy and tactics. I help plan quality strategies and tactical approaches for organizations, especially esoteric test automation. I gravitated into quality assurance from engineering. I've been a production engineer, developer, support engineer, tester, writer, instructor, and I've managed manufacturing quality assurance, software quality assurance, technical support, software development, and documentation. My work has cut across the industry from start-ups to multi-trillion dollar organizations. Along the way I have learned a great deal about software testing and automation. I enjoy sharing what I've learned with interested people.

Current employment

– President of Software Quality Methods, LLC. (SQM)

– Management consultant in strategic and tactical planning for software quality

Education

– B.A. in Computer Science

– MS in Electrical Engineering, (Digital Design and Information Science)

– MBA

Professional

– Past President, Association for Software Testing

– Past Chair, Silicon Valley Section, American Society for Quality (ASQ)

– Founding Member and Current Chair, Santa Clara Valley Software Quality Association (SSQA)

– Certified in Software Quality Engineering (ASQ-CSQE)

– Certified Quality Manager (ASQ-CMQ/OE)

– Participant in the Los Altos Workshop on Software Testing and dozens of other offshoots


Today’s Topics

• Test automation

• Automated exploratory tests (Part 1)

• Ancillary test support

• Test oracles and automation

• Automated exploratory tests (Part 2)

• Results comparison


The Question About SUT Misbehavior

What can happen if there is a bug in the SUT?

Anything!


Automation Background: Working Concepts


Opportunities For Automation

• Program analysis

• Test design

• Test case management/selection

• Input selection/generation

• Automated test case

• Test execution control

• Actual results capture

• Expected results generation

• Results comparison

• Report generation


Automation Defined For This Class

Get the computer to do one or more of:

− Input selection/generation

− Automated test case

− Test execution control

− Actual results capture

− Expected results generation

− Results comparison


Automated vs. Manual Tests

An automated test is not equivalent to the most similar manual test:

– Automated comparison is typically more precise (and may be tripped by irrelevant discrepancies)
– Skilled human comparison samples a wider range of dimensions, noting oddities that one wouldn't program the computer to detect
– Automated input is consistent, while humans cannot closely replicate their test activities


Automation Narrows Our Scope

An automated test is more limited than a manual test:

− The test exercise must be automated in advance
− People can integrate outside experience
− People may gain insights
− We can only check machine-available results
− We only check result values that we pre-specify
− Cannot easily recover when something unexpected happens


Typical Automated Tests

• Mimic a human tester's actions to repeat and speed up manual testing
• Work at the UI level
• Are a list of test activities (a script)
• Check results at specified points in the script
• Are based on the functions of a test tool


Regression Test or Demo?

Regression Test:
• Think of an interesting test
• Map out steps to perform the test
• Run the steps manually
  − if there is a bug, report and wait until fixed
  − if no bugs, capture the steps
• Rerun the steps to repeat the test
The goal is to find bugs.

Demo:
• Think of an interesting demo
• Map out steps to perform the demo
• Run the steps manually
  − if there is a bug, report and wait until fixed
  − if no bugs, capture the steps
• Rerun the steps to repeat the demo
The goal is to AVOID bugs.

Valuable Regression Automation

• Build/smoke tests

• Required repeatability

• Rerun many times

• Demo script

• Comfort managers


Questions We Should Ask About Testing and Automation

• Should we limit our thinking to what a tool does?
• Should we focus automation on things we can do manually and script?
• Do sped-up automated tests find more or different bugs than manually running the tests?
• Should we limit ourselves to UIs or APIs?
• Are we checking everything that's important?
• Can inefficient or approximate tests be valuable?
• Must tests do the same things every time?


Exploratory Automated Tests

Part 1: Introduction


Examples of Exploratory Automation

• Cem Kaner’s “Telenova” example using random events

• Statistical packet profiles for data link testing

• Using “Dumb monkeys” and MS Word

• “Sandboxed” random regression tests

• 1/3 or 3x heuristic in a test harness

• Periodic database unload/check

• Database links checking

• Database locking (single and multi-threaded)

• Long walks for a device front panel state machine

• Database load/unload dropouts and shifts

• Database unbalanced splitting

• Random machine instruction generation


The Easy Part of Testing

• Driving the inputs or generating data is usually straightforward
• Design of tests that can reveal bugs is more difficult
• The hard part is telling whether or not the SUT is misbehaving (the oracle problem)

Exploratory Automated Testing is about solving all three parts.


What Makes Automation Exploratory?

• Enables and amplifies tester’s abilities

• Does something different every time

• May use massive numbers of iterations

• May use multiple parallel oracles

• Can find bugs we never imagined


Enable the Exploratory Tester

• Enables a human tester to do more things

• Enables a human tester to work faster

• Amplifies a human tester’s abilities

• Can look under the covers

• Can provide real time monitoring


Randomness and Exploratory Tests

• The power of doing new things

• Random number generators

• Randomized input values

• Randomized data generation

• Random test selection
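
A minimal sketch of the seed discipline that makes randomized exploration replayable (the input fields and generator details are illustrative assumptions, not from the original deck):

```python
import random
import time

def random_inputs(seed, count=1000):
    """Generate a reproducible stream of randomized test inputs."""
    rng = random.Random(seed)  # private generator; global random state untouched
    for _ in range(count):
        yield {
            "amount": rng.uniform(-1e6, 1e6),
            "text": "".join(rng.choice("abcXYZ !@#") for _ in range(rng.randint(0, 40))),
        }

seed = int(time.time())      # a fresh seed makes every run do something different
print(f"Run seed: {seed}")   # log the seed so a surprising run can be replayed
for case in random_inputs(seed):
    pass                     # feed each generated case to the SUT here
```

Logging the seed is what later lets a failing run be reproduced for diagnosis.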


Advantages of Exploratory Automation

• Supplement the baseline tests

• Does things a human tester cannot do

• Does something different each time

• May use massive numbers of iterations

• May feed inputs directly to the SUT

• Oracles may check internal information

• May run multiple parallel oracles

• Can uncover obscure bugs

• Can uncover bugs impossible to find manually


Disadvantages of Exploratory Automation

• May not be repeatable (even with seeds)
• Difficult to capture program and system information for diagnosis
• Coordination of test activities and multiple, autonomous, real-time oracles
• Does not provide rigorous coverage
• Can uncover bugs that can't be fixed


A Model of Test Execution: Setting The Stage For Oracles


Oracles and Test Automation

Good automated testing depends on our ability to programmatically detect whether the SUT behaves in expected or unexpected ways.

Our ability to automate testing is fundamentally constrained by our ability to create and use oracles.


The Oracle

• The principle or mechanism for telling whether the SUT behavior appears OK or if further investigation is required
• Answers the question "is this expected or acceptable behavior?"
• A fundamental part of every test execution
• Expected result is NOT required in advance
• Multiple oracles can be used for a test


Test Execution Model

[Diagram: the System Under Test receives Test Inputs, Input Data, Program State, and Environmental Inputs, and produces Test Results, Postcondition Data, Post-condition Program State, and Environmental Results.]


Implications of the Model

• The test exercise is usually the easy part

• We don’t control all influences

• We can’t check everything

• Multiple domains are involved

• We can’t know all the factors


Notes On Test Oracles

• At least one type is used in every test

• Implemented oracles may have some characteristics of multiple types of oracles

• Multiple oracles may be used for one test

• Oracles may be independent or integrated into a test

• They may be synchronous or asynchronous

• The oracles are key to good automation


Important Factors in Outcome Comparison

• Which outcomes do we check?

• How do we know what to expect?

• Do we know how to capture it?

• Can (or should) we store the results?

• What differences matter?

• Are “fuzzy” comparisons needed?

• When do we compare outcomes?


Exploratory Automated Testing: Ancillary Support


Keys to Support of EAT

• The oracles are key to identifying program misbehavior
• Logging of important data for anomaly analysis is critical
• Coordination of oracles with the test is desirable
• Keep reported data consistent – a human is going to be involved when an anomaly is detected
• Systems must have similar configurations


Management Support

• If management values testing for important bugs, they should consider EAT as valuable
  – Understand how much effort EAT deserves
  – Keep the test count and flagged errors out of metrics (until a bug is confirmed)
• If management does not understand the value of EAT
  – Educate them if possible
  – Covert measures may be appropriate


Administrative Support

• The systems must be configured similarly
  – Drives
  – Versions of OS (except when testing different platforms)
  – Data set locations
  – Permissions (except when testing them)
  – Background processes (except oracles)
• Updating needs to be managed centrally


Development Guidelines

• Close support from development is extremely valuable for EAT
• Some EAT can be done without direct support from development
  – Ask two questions of the developers:
    • What data do you need if I find a potential error?
    • How can I get that data for you?
  – Use a black box (or grey box) approach
  – Limit background processes (except oracles)


Software Test Automation: About Test Oracles


Defining A Test Oracle

The mechanism or principle by which we determine whether or not the software under test (SUT) is behaving reasonably.

– Unreasonable SUT behavior requires investigation by a person.
– We just move on when the oracle says the behavior is unremarkable.


Test Execution Model

[Diagram, repeated from earlier: the System Under Test receives Test Inputs, Input Data, Program State, and Environmental Inputs, and produces Test Results, Postcondition Data, Post-condition Program State, and Environmental Results.]


The Test Oracle

Different views on the use of oracles:

– Reference Function: You ask the oracle what the “correct” answer is and it gives you outcomes

– Evaluation Function: You ask the oracle whether the program behavior should be investigated, independent of inputs

– Reference and Evaluation Function: You ask the oracle whether the program behavior is nominal or abnormal given inputs and outcomes


Types of Comparisons

Comparing the program's behavior to a reference of expected behavior can be inexact.

– Deterministic oracle (a mismatch means behavior is abnormal)
– Probabilistic oracle (a mismatch means behavior is probably abnormal)

Sometimes comparison is done in the oracle.


Which Outcomes To Check

• Expected results

• Anticipated likely errors

• Available easy oracles

• Major environmental factors

• Invariants


Checking Results

• Exact comparison
  − Values
  − Ranges
  − Behaviors
  − Key words/values
• Heuristic comparison
  − Similarity
  − Algorithm based
  − Secondary characteristics
• Check during or after the test run


Software Test Automation: Oracle Characteristics


Oracles: Challenges

• Completeness of information
• Accuracy of information
• Usability of the oracle or of its results
• Maintainability of the oracle
• Complexity when compared to the SUT
• Temporal relationships
• Costs
• Value (ROI)


Oracle Completeness

• Input Coverage

• Result Coverage

• Function Coverage

• Sufficiency of the checks

• Types of errors it might detect

• SUT environments it works for

There may be more than one oracle for the SUT

Inputs may affect more than one oracle


Oracle Accuracy

• How similar to the SUT
  − Arithmetic accuracy
  − Statistical similarity
• How extensive
  − The more ways in which the oracle matches the SUT (i.e., the more complex the oracle), the more errors
• What types of possible errors
  − Misses
  − False alarms


Oracle Accuracy

• Independence from the SUT
  − Algorithms
  − Sub-programs & libraries
  − System platform
  − Operating environment

Close correspondence makes common-mode faults more likely and often reduces maintainability.


Oracle Usability

• Form of information

− Bits and bytes

− Electronic signals

− Hardcopy and display

• Location of information

• Data set size

• Fitness for intended use

• Availability of comparators

• Support in SUT environments


Oracle Maintainability

• COTS or custom
  − A custom oracle can become more complex than the SUT
  − More complex oracles make more errors
• Keeping correspondence through changes
  − Test exercises
  − Test data
  − Tools
• Ancillary support activities


Oracle Complexity

• Correspondence with the SUT
• Coverage of SUT domains and functions
• Accuracy of generated results
• Maintenance effort necessary to keep correspondence through SUT changes
• Difficulty of ancillary support activities


Oracle Temporal Relationships

• Speed of results generation

• Speed of comparison

• When the oracle is run

• When results are generated

• When results are compared


Oracle Costs

• Creation or acquisition costs

• Maintenance of oracle and comparators

• Execution cost

• Comparison cost

• Additional analysis of errors

• Cost of misses and false alarms

• Cost of ancillary activities


Oracle Value

• Value and costs are compared with ROI
• Financial ROI is computed with cost savings
• Test ROI is based on the value of information
  − Total cost of a test (create, check, misses, false alarms, opportunity, etc.)
  − Real value of the information (required knowledge, insignificant, etc.)
  − Not easily valued in monetary form


Software Test Automation: Types of Test Oracles


Types of Test Oracles

• No Oracle
• Independent Implementation
• Consistency
• Self-Verifying
• Model-Based
• Constraint-Based
• Probabilistic
• Statistical
• Property-Based
• Computational
• Diagnostic
• Hand-Crafted
• Human


Notes About ‘Heuristic Oracles’

I used to talk about heuristic oracles, which were ones that used approximations and rules of thumb. I have refined the descriptions and no longer differentiate between heuristic and other oracles. Some oracles are based on heuristic principles while others are not. But all oracles are "heuristic" with regard to pass and fail. That is:

– Passing a test doesn't mean there isn't a bug here
– Failing a test doesn't mean there is a bug present
– No oracle is complete – there are always ways the SUT could be failing in other unnoticed ways


Oracle Taxonomy (1 of 4)

No Oracle
– Definition: Doesn't check correctness of results (only that some results were produced)
– Advantages: Can run any amount of data (limited only by the time the SUT takes)
– Disadvantages: Only spectacular failures are noticed

Independent Implementation
– Definition: Independent generation of all expected results
– Advantages: No encountered errors go undetected; can be used for many different tests; fast way to automate using an oracle (if available)
– Disadvantages: Expensive to implement and maintain; complex and often time-consuming when run

Consistency (Saved Master)
– Definition: Verifies current run results against a previous run
– Advantages: Verification is straightforward; can generate and verify large amounts of data
– Disadvantages: Original run may include undetected errors; can take large data storage space

Consistency (Function Equivalence)
– Definition: Verifies current run against another implementation
– Advantages: Verification is straightforward in real time; can generate and verify large amounts of data without storage
– Disadvantages: Possibility of undetected common-mode errors


Oracle Taxonomy (2 of 4)

Self-Verifying
– Definition: Embeds the key to the answer within the data
– Advantages: Can check large amounts of data; allows extensive run-time and post-test analysis; straightforward to check; verification can be based solely on message contents
– Disadvantages: Test must require an input data set with compliant structure; must define answers and generate messages to contain them

Model-Based
– Definition: Uses a digital data model of SUT behavior
– Advantages: May use the digital model for multiple tests; digital form of the model is easier to maintain than an automated test; tests may work for multiple SUTs by using different models
– Disadvantages: Maintenance of complex SUT models is expensive; model must match expected behavior

Constraint-Based
– Definition: Verifies characteristics in compliance with constraints
– Advantages: Faster and easier than most other oracles; less expensive to create and use; may be reusable across SUTs and tests
– Disadvantages: Can miss systematic errors; can miss obvious errors


Oracle Taxonomy (3 of 4)

Probabilistic
– Definition: Checks a highly likely characteristic
– Advantages: Faster and easier than most other oracles; less expensive to create and use
– Disadvantages: Can cause false alarms; can miss systematic errors; can miss obvious errors

Statistical
– Definition: Uses statistical properties of outcomes
– Advantages: Allows checking of very large data sets; allows checking of live systems' data; sometimes allows post-test checking
– Disadvantages: Can cause false alarms; may miss systematic errors; can miss obvious errors

Property-Based
– Definition: Checks an indirectly correlated characteristic
– Advantages: Usually simple; good for straightforward transformations
– Disadvantages: Correlated variable may not exist; may be difficult to exploit the correlation


Oracle Taxonomy (4 of 4)

Computational
– Definition: Reverses the behavior of the SUT to revert results to inputs
– Advantages: Good for mathematical functions; good for straightforward transformations
– Disadvantages: Limited applicability; may require complex programming

Diagnostic
– Definition: Programmatically checks assertions or traces activities and values
– Advantages: May be built into the code as defensive programming; provides good diagnostic data when an exception is detected
– Disadvantages: Can miss significant errors; changes program characteristics; can miss obvious errors

Hand-Crafted
– Definition: Result is carefully selected by the test designer
– Advantages: Useful for some very complex SUTs; expected result can be well understood
– Disadvantages: Does the same thing every time; limited number of cases can be generated

Human
– Definition: Applies a person's brain power to decide correctness
– Advantages: Available; flexible (can always be applied); applies a broad spectrum of filters
– Disadvantages: Slow; error prone; easily distracted
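
The 'Computational' type gets no dedicated slides below, so here is a minimal sketch of the idea, assuming a SUT whose behavior has a known inverse (zlib compression is only a stand-in for the real system under test):

```python
import random
import zlib

def sut_compress(data: bytes) -> bytes:
    """Stand-in SUT: any function with a known inverse fits this pattern."""
    return zlib.compress(data)

# Computational oracle: run the inverse and check the original inputs come back.
rng = random.Random(42)
for i in range(1000):
    payload = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 500)))
    result = sut_compress(payload)
    assert zlib.decompress(result) == payload, f"round trip failed on case {i}"
```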


‘No Oracle’

• Definition:
  − No check for correctness – just run the test exercise (watch for crashes)
• Examples:
  − Random keyboard input
  − API calls with no return checks


‘No Oracle’

• Advantages:

− Easy to implement

− Runs fast

• Disadvantages:

− Only spectacular events are noticed

− Nothing found ≠ No problem

− False sense of accomplishment


‘No Oracle’ Strategy

• Run the test using generated [usually random] inputs
• Don't worry about SUT behavior since we're only looking for spectacular results
• ROI based on really inexpensive input generators
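
A minimal sketch of the strategy, assuming an in-process SUT call (sut_parse is a hypothetical stand-in); the only "oracle" is watching for spectacular failures:

```python
import random

def sut_parse(text: str) -> None:
    """Hypothetical SUT entry point; replace with the real call."""
    int(text)  # deliberately fragile stand-in

rng = random.Random()
for i in range(100_000):
    blob = "".join(chr(rng.randrange(32, 127)) for _ in range(rng.randrange(20)))
    try:
        sut_parse(blob)
    except ValueError:
        pass  # the SUT rejecting bad input is unremarkable
    except Exception as exc:  # anything else is a "spectacular" event
        print(f"case {i}: input {blob!r} crashed the SUT: {exc!r}")
```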


‘No Oracle’ In Context

• May be useful:
  − Early development testing
  − Robustness testing
  − Load or life testing
  − Security testing ("fuzzing")
• Usually avoided:
  − Used as a primary test mechanism
  − When results must be correct
  − Input is non-trivial to generate


‘Independent Implementation’ Oracle

• Definition:
  − A separate program created to generate expected results
• Examples:
  − Adder function written to check a calculator
  − An alternative algorithm implementation


‘Independent Implementation’ Oracle

• Advantages:
  − Customized for the SUT
  − Can be exact for some SUT characteristics
• Disadvantages:
  − Often expensive to implement
  − May be difficult to run
  − May have high maintenance costs


‘Independent Implementation’ Strategy

• Independent implementation

• Complete coverage over some domain

− Specific functions

− Input ranges

− Result ranges

• Generates “Correct” results
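
A minimal sketch of the calculator example from the earlier slide (calculator_add is a hypothetical stand-in for the real SUT call; Python's built-in arithmetic plays the independent implementation):

```python
import random
from operator import add  # the independent reference implementation

def calculator_add(a: int, b: int) -> int:
    """Hypothetical SUT: the calculator function under test."""
    return a + b  # replace with the real calculator call

rng = random.Random(7)
for i in range(10_000):
    a = rng.randrange(-10**9, 10**9)
    b = rng.randrange(-10**9, 10**9)
    expected = add(a, b)  # oracle computes its own answer, independently
    actual = calculator_add(a, b)
    assert actual == expected, f"case {i}: {a}+{b} gave {actual}, expected {expected}"
```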


‘Independent Implementation’ In Context

• May be useful:
  − Inexpensive to create or acquire
  − Extremely high reliability required
  − Reasonably good coverage of some range
• Usually avoided:
  − Too expensive to create or acquire
  − Areas where exactness is not required
  − High risk that the oracle may be wrong
  − Other types of oracles are sufficient


‘Consistency’ Oracle

• Definition:
  − Compare results between different programs or from run to run to identify when values change
  − Really only indicates that something is different
  − Usually don't know whether or not the original results are "correct"
• Examples:
  − Comparison of test results with a "golden master"
  − Function equivalence testing


‘Consistency’ Oracle

• Advantages:
  − Common, straightforward approach
  − Usually easy to implement using log files
  − Logs can be any data types
  − Can check without knowing the correct results
• Disadvantages:
  − Legacy bugs may be there but undiscovered
  − Established bugs may "get tenure"
  − Nothing found means no obvious changes


‘Consistency’ Oracle Strategy

• Check for changes or differences

− Against previous results (“Golden Master”)

− Against alternate versions or platforms

− Function equivalence testing

• Results comparison with:

− Validated outcomes

− Unvalidated outcomes
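
A minimal saved-master sketch, assuming the SUT's output can be captured as text (run_sut and the master file name are illustrative assumptions); note that the master records consistency, not correctness:

```python
from pathlib import Path

def run_sut(case: str) -> str:
    """Hypothetical SUT invocation returning its observable output."""
    return case.upper()  # stand-in behavior

MASTER = Path("golden_master.txt")
output = "\n".join(run_sut(c) for c in ["alpha", "beta", "gamma"])

if not MASTER.exists():
    MASTER.write_text(output)  # first run: record the master, unvalidated
    print("Master saved; correctness NOT verified, only recorded.")
elif output != MASTER.read_text():
    print("Mismatch with master: something changed; investigate.")
else:
    print("Consistent with master (which may itself contain legacy bugs).")
```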


‘Consistency’ Oracles In Context

• May be useful:

− Inexpensive to create

− Have automated tests with logging

− High likelihood that the oracle is correct

− Equivalent product is available

• Usually avoided:

− High risk that the oracle may be wrong

− Other types of oracles are available


‘Self-Verifying Data’ (SVD) Oracle

• Description:
  − Use a "key" to tag data when generated and later use the tag to verify data correctness
  − Self-descriptive data (Red)
  − Cyclic algorithms (repeating patterns)
  − Tagged data (CRC)
  − Embedded keys with shared algorithms
  − SVD can be used for data sets or input data


‘Self-Verifying Data’ (SVD) Oracle

• Advantages:
  − Excellent for random data generation
  − Generates verifiable records
  − Can be used to quickly check large data sets
  − Allows extensive results analysis during or after test execution
• Disadvantages:
  − Persistent, non-transformed data required
  − Requires some place to embed a key


‘SVD’ Strategy

• Embed key information within the data to verify data integrity
  − Human readable form (self-descriptive)
  − Data pattern based (cyclic)
  − Embed unique summary (CRC)
  − Embedded seed or key
• Data verification
  − Recreate data
  − Extract the pattern
  − Synchronous or asynchronous checking
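
A minimal sketch of the CRC variant, assuming records are plain text with room for an embedded tag (the record layout is an illustrative assumption); each record carries a checksum of its own payload, so a later pass can verify any record in isolation:

```python
import zlib

def make_record(payload: str) -> str:
    """Tag the payload with a CRC of itself: the record self-verifies."""
    return f"{payload}|{zlib.crc32(payload.encode()):08x}"

def verify_record(record: str) -> bool:
    """Recompute the CRC from the payload and compare with the stored tag."""
    payload, _, tag = record.rpartition("|")
    return f"{zlib.crc32(payload.encode()):08x}" == tag

rec = make_record("customer=42;balance=17.50")
assert verify_record(rec)                          # intact record checks out
assert not verify_record(rec.replace("42", "43"))  # corruption is caught
```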


‘SVD’ In Context

• May be useful:
  − High volume of inputs or referenced data
  − Key or seed can be used for data regeneration
  − Straightforward to incorporate the key with the data
• Usually avoided:
  − Outcomes don't reflect SVD data
  − Data not easily generated from a key
  − Key not easy to include with data
  − Overly complex data structures


‘Model-Based’ Oracle

• Definition:
  − Use a model of some SUT behavior to structure a test and provide the oracle
• Examples:
  − Menu tree
  − Internal state machine


‘Model-Based’ Oracle

• Advantages:

− Decouples tests from SUT models

− Works well for families of similar products

− Provides test ideas and oracles

• Disadvantages:

− SUT must have model-based behaviors

− Models must be simple to be practical

− Often requires ability to detect program state


‘Model Based’ Oracle Strategy

• Identify and describe a machine-readable form of a model of some aspect of the SUT
• Design tests using the model as input
• Implement the tests (by reading in the model)
• Use the model information to verify actions
• Update the model or use multiple models as needed


‘Model-Based’ Tests and Oracles

• Describe a model of the software (e.g., menu tree)
• Describe the model in a machine-readable form
• Verify the correctness of the model against the intended behavior
• Write a test using the model as input, as a guide for generating events, and for monitoring responses
• Generate events / inputs to the program from the model
• Check program functions and state transitions based on the model


Some Types of Models

• State Diagrams

• State Transition Tables

• Flow Charts

• Use Cases

• Business rules

• Entity-Relationship Diagrams

• Activity Diagrams


State Transition Table Model Example

Initial State | Event | Result | New State
S1 | E1 | <none> | S1
S1 | E2 | logged in | S2
S1 | E3 | SU log in | S3
S2 | E4 | … | S4
S2 | E5 | <none> | S2
S2 | E6 | logged out | Exit
S3 | E4 | … | S4
S3 | E5 | admin | S3
S3 | E6 | logged out | Exit

[Diagram: the corresponding state graph over S1, S2, S3, S4, and Exit, with transitions labeled E1 through E6.]
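
A minimal sketch of a model-based oracle driven by the table above, assuming the SUT exposes hooks to fire events and report its state (the commented-out sut_fire and sut_state calls are hypothetical): a random walk fires legal events and the model predicts where the SUT must land.

```python
import random

# The state transition table above, as machine-readable data.
MODEL = {
    ("S1", "E1"): "S1", ("S1", "E2"): "S2", ("S1", "E3"): "S3",
    ("S2", "E4"): "S4", ("S2", "E5"): "S2", ("S2", "E6"): "Exit",
    ("S3", "E4"): "S4", ("S3", "E5"): "S3", ("S3", "E6"): "Exit",
}

state = "S1"  # the model's belief about the SUT state
rng = random.Random(99)
while state not in ("S4", "Exit"):
    legal = [e for (s, e) in MODEL if s == state]
    event = rng.choice(legal)      # randomly walk the model
    # sut_fire(event)              # hypothetical hook: deliver the event to the SUT
    state = MODEL[(state, event)]  # model predicts the new state
    # assert sut_state() == state  # hypothetical hook: the oracle check
    print(f"fired {event}, model now in {state}")
```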


‘Model Based’ Oracle In Context

• May be useful:
  − When the model is simple
  − When the model is well defined (e.g., state machine or screen hierarchy descriptions)
  − Events can be reliably generated
  − Best if states can be monitored or detected
• Usually avoided:
  − No identifiable models
  − Unstable, poorly defined, or complex models
  − Events are difficult to generate or manage
  − State of the program is not easily determined


Exploratory Automated Testing

Belgium Testing Days, February 27, 2013

Tutorial Part 2

Douglas Hoffman, BACS, MBA, MSEE,

ASQ-CSQE, ASQ-CMQ/OE, ASQ Fellow

Software Quality Methods, LLC. (SQM)

www.SoftwareQualityMethods.com

[email protected]


‘Constraint-Based’ Oracle

• Definition:
  − Check for simple valid (or invalid) individual or combined data values or characteristics
• Examples:
  − US ZIP Codes are 5 or 9 digits; Canadian postal codes are 6 alphanumeric characters
  − Employee Start Date is after Employee Birth Date


‘Constraint-Based’ Oracle

• Advantages:

− Low-cost checks for some error conditions

− Provides both test ideas and oracles

• Disadvantages:

− May be code-invasive

− Constraint conditions have to be recognized


‘Constraint-Based’ Strategy

• Identify checkable secondary characteristics
  − Size, form, type, illegal values, etc.
  − Values that constrain one another
  − Invariant rules
  − Coincident (not causal) values
• Checking
  − Create a checking mechanism to confirm conformance with the constraints
  − Synchronous or asynchronous checking
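
A minimal sketch over the two examples from the Definition slide, assuming records arrive as dictionaries (the field names are illustrative assumptions); each constraint is a cheap predicate applied to every record:

```python
import re
from datetime import date

CONSTRAINTS = [
    ("US ZIP is 5 or 9 digits",
     lambda r: re.fullmatch(r"\d{5}(\d{4})?", r["zip"]) is not None),
    ("start date after birth date",
     lambda r: r["start_date"] > r["birth_date"]),
]

records = [
    {"zip": "95054", "birth_date": date(1980, 1, 1), "start_date": date(2005, 6, 1)},
    {"zip": "9505",  "birth_date": date(1980, 1, 1), "start_date": date(1975, 6, 1)},
]

for i, rec in enumerate(records):
    for name, check in CONSTRAINTS:
        if not check(rec):
            print(f"record {i} violates constraint: {name}")
```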


‘Constraint-Based’ In Context

• May be useful:
  − Recognizable relationships
  − Straightforward to check the relationships
  − Potentially quick verification of large data sets
• Usually avoided:
  − No recognizable relationships of value
  − Getting values is difficult
  − Checking is overly complex


‘Probabilistic’ Oracle

• Definition:
  − Use an approximation, heuristic (rule of thumb), or partial information that supports but does not necessarily mandate a given conclusion
• Examples:
  − An employee is usually older than their dependents
  − This test should take more than 1/3 the time and less than 3 times the time it ran last


‘Probabilistic’ Oracle

• Advantages:
  − Can provide a quick 'sanity check' on large data sets
  − May be inexpensive to implement
• Disadvantages:
  − Misses or false alarms may be expensive
  − Nothing found ≠ No problem


‘Probabilistic’ Oracle Strategy

• Probabilistic opportunities
  − Relationships that are usually true
  − Very unlikely conditions
  − An approximation is available
• Checking
  − Create a checking mechanism to confirm the relationship or condition
  − Synchronous or asynchronous checking
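
A minimal sketch of the 1/3-or-3x timing heuristic mentioned earlier, assuming run durations are kept in a small JSON file (run_test and the file name are illustrative assumptions); an out-of-band run is flagged for investigation, not failed outright:

```python
import json
import time
from pathlib import Path

def run_test() -> None:
    """Hypothetical test body being timed."""
    time.sleep(0.1)

HISTORY = Path("durations.json")
start = time.perf_counter()
run_test()
elapsed = time.perf_counter() - start

if HISTORY.exists():
    last = json.loads(HISTORY.read_text())["last"]
    # Probabilistic oracle: plausible runs take between 1/3 and 3x the last run.
    if not (last / 3 <= elapsed <= last * 3):
        print(f"Suspicious: took {elapsed:.3f}s vs {last:.3f}s last time; investigate.")
HISTORY.write_text(json.dumps({"last": elapsed}))
```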


Some Useful Oracle Heuristics

• Similar results that don't always work
• Less exact computations (32 bits instead of 64)
• Look for subsets
• Any relationships not explicit in the SUT
  – Date-time/transaction number
  – One home address
  – Timings
• Look for general characteristics
• Harmonic or repeating patterns


‘Probabilistic’ Oracle In Context

• May be useful:
  − High speed of comparison is important
  − Exact results are difficult to generate
  − Few expected exception cases (false alarms)
  − A simple heuristic is available
• Usually avoided:
  − Used as the only oracle
  − The heuristic is too complex
  − More exact checks are required
  − Too many expected exceptions


‘Statistical’ Oracle

• Definition:
  − Compare the statistical properties of the inputs and results to confirm similarity
• Examples:
  − The distribution of the input file records is similar to the output distribution
  − Today's pattern of use is similar to yesterday's
  − Test random communication data packets by checking statistical properties of the packets


‘Statistical’ Oracle

• Advantages:
  − Can check a high volume of well-formed data
  − May be inexpensive to implement
  − Can be used with live data
• Disadvantages:
  − Only useful with large populations
  − Result properties must be described by some transformation of the input properties
  − Inputs require reasonably computable properties


‘Statistical’ Oracle Strategy

• Principal idea:
  – Useful for high-volume data impractical to otherwise verify
  – Use high-volume random tests generated with particular statistical properties
  – Results checking is based on population statistical properties
• The oracle:
  – Computes the statistical properties of the inputs
  – Computes the statistical properties of the outcomes
  – Compares the statistics in light of the expected transformation through the SUT
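
A minimal sketch, assuming the SUT is expected to apply a known transformation (doubling) to each value (sut_transform is a hypothetical stand-in); the oracle checks only the population statistics, never individual outputs:

```python
import random
import statistics

def sut_transform(x: float) -> float:
    """Hypothetical SUT: expected to double each value."""
    return 2.0 * x

rng = random.Random(1)
inputs = [rng.gauss(100, 15) for _ in range(100_000)]
outputs = [sut_transform(x) for x in inputs]

# Oracle: the expected transformation predicts the output statistics.
expected_mean = 2.0 * statistics.fmean(inputs)
actual_mean = statistics.fmean(outputs)
if abs(actual_mean - expected_mean) > 0.01 * abs(expected_mean):
    print(f"Population mean off: {actual_mean:.2f} vs {expected_mean:.2f}; investigate.")
```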


‘Statistical’ Oracle In Context

• May be useful:
  − Inputs and outcomes can be statistically counted
  − Strong correlation between input and outcome population statistics
  − Population statistics are easily computable
• Usually avoided:
  − Used as the only oracle
  − Data is not countable for statistical profiling
  − Input/outcome populations are not statistically related


‘Property-Based’ Oracle

• Definition:
  − Compare correlated but not causal relationships between variables
• Examples:
  − Sales Order Numbers should be in time-sequence order
  − Checking the number of pages printed by a test


‘Property-Based’ Oracle

• Advantages:

− Can independently double-check data consistency

− Can be used with live data

• Disadvantages:

− Property has to exist and be recognized

− Property has to be easily testable

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 98

‘Property-Based’ Oracle Strategy

• Principal idea:
  – Check secondary or coincidental characteristics
  – Most are 'sanity checks'
• The oracle:
  – Checks for conformance with expected characteristics
  – May miss obvious errors
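
A minimal sketch of the order-number example, assuming each order is a (timestamp, order number) pair (the record shape is an illustrative assumption); the oracle checks only the correlated property, not the orders themselves:

```python
orders = [
    ("2013-02-27T09:00", 1001),
    ("2013-02-27T09:05", 1002),
    ("2013-02-27T09:09", 1004),
    ("2013-02-27T09:12", 1003),  # violates the property
]

# Property oracle: sorted by time, order numbers must be strictly increasing.
by_time = sorted(orders)
for (t1, n1), (t2, n2) in zip(by_time, by_time[1:]):
    if n2 <= n1:
        print(f"Order {n2} at {t2} is out of sequence after {n1} at {t1}.")
```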


‘Property-Based’ Oracle In Context

• May be useful:
  − Recognizable secondary or coincidental characteristics
  − In-test checking for obvious errors
  − Simple checks may identify significant problems
• Usually avoided:
  − Unimportant characteristics
  − High expectation of false alarms
  − Difficult or invasive checking


‘Diagnostic’ Oracle

• Definition:
  − Instrument the code (assertions or logging)
• Examples:
  − Code assertions (e.g., must have a valid UID to be logged in)
  − Log files generated by the SUT


‘Diagnostic’ Oracle

• Advantages:

− Can test for invisible conditions

− Can be tailored for specific types of errors

• Disadvantages:

− Code invasive

− May be difficult to interpret the logs


‘Diagnostic’ Oracle Strategy

• Principal idea:
  – Embed checks or logs in the code
  – Use high-volume random tests to create a large number of conditions
  – Errors are flagged by asserts or by analysis of log files
• The oracle:
  – Code assertions flag "impossible" conditions
  – Inconsistencies are identifiable in the log files
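
A minimal sketch of diagnostics embedded in SUT code, assuming the developers accept assertion and logging hooks (the session functions are hypothetical); the assert flags an "impossible" state the moment random testing reaches it, and the log preserves the trail:

```python
import logging

logging.basicConfig(filename="sut_diagnostics.log", level=logging.DEBUG)
_sessions: dict[int, str] = {}

def login(uid: int, name: str) -> None:
    """Hypothetical SUT function instrumented defensively."""
    assert uid > 0, f"impossible condition: invalid UID {uid}"  # diagnostic oracle
    _sessions[uid] = name
    logging.debug("login uid=%d name=%s sessions=%d", uid, name, len(_sessions))

def logout(uid: int) -> None:
    assert uid in _sessions, f"impossible condition: logout of unknown UID {uid}"
    del _sessions[uid]
    logging.debug("logout uid=%d sessions=%d", uid, len(_sessions))

login(42, "doug")
logout(42)  # a high-volume random event sequence would drive these calls instead
```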


‘Diagnostic’ Oracle In Context

• May be useful:
  − Code is available for modification
  − Close working relationship with developers
  − Code changes do not significantly change program behavior
• Usually avoided:
  − Uncooperative developers
  − Program is timing dependent
  − Changes cause program behavior to change significantly


‘Hand-Crafted’ Oracle

• Definition:
  − Select inputs for ease of generating expected results
  − Select or compute a complex result and corresponding inputs
• Examples:
  − Construct a drawing with 8 overlapping layers containing different figures, location, and fill …
  − Sub-atomic particle A with mass X and velocity Y is hit by particle B with mass W and velocity Z, resulting in generation of particles C and D …


‘Hand-Crafted’ Oracle

• Advantages:

− Can test for specific conditions (e.g., boundaries)

− Can provide oracles for arbitrarily complex SUTs

− Oracle is often built into the test

• Disadvantages:

− One-off cases

− Has a limited ability to detect errors


‘Hand-Crafted’ Oracle Strategy

• Expected result is carefully crafted (or selected) with the input values
• Input and result are specified together
• Oracle is frequently built into the test
• The approach is most often taken for regression tests in complex SUTs


‘Hand Crafted’ Oracle In Context

• May be useful:

− Complex function

− Special cases are easily identified

• Usually avoided:

− Outcomes extremely difficult or time-consuming to predict

− Other oracles are available


‘Human’ Oracle

• Definition:
  − A person observes SUT behavior to detect possible problems
• Examples:
  − Manual GUI testing

Note that a human oracle is applied whenever any other oracle flags a potential bug.


‘Human’ Oracle

• Advantages:
  − Extremely flexible to changes
  − Tests are multi-dimensional (using all senses)
  − Can notice potential errors on many levels at once (e.g., placement, font, content, navigation)
• Disadvantages:
  − Human speeds for computation
  − Cannot do exact comparisons on large data sets
  − Easily trained (e.g., inattentional blindness)


‘Human’ Oracle Strategy

• Set a person in front of the SUT to observe
• The human uses their judgment to decide the verdict
• Works for manual or automated exercises
• Works for scripted or unscripted tests

Note that a human oracle is applied whenever any other oracle identifies a potential bug.

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 111

‘Human’ Oracle In Context

• May be useful:

− Outcome medium is not machine readable

− Input/outcome relationship is very complex

− Insufficient time to automate the oracle

• Usually avoided:

− High volume of comparisons

− Very repetitive checking

− High level of detail or specificity required

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 112

Software Test Automation:

Test Comparators

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 113

Evaluation Mechanisms

• Deterministic
– The comparator (or oracle) accepts the inputs and/or the test outcomes to compare for a match between expected and actual behaviors
– If they do not match, the outcome is wrong and investigation is required

• Non-Deterministic
– The comparator (or oracle) accepts the inputs and/or the test outputs and evaluates whether the results are plausible
– If the outcomes are implausible or not close enough, investigation is required

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 114

Deterministic Evaluation

Comparison for data set equivalence

• Saved result from a previous run

• Parallel generation

– previous version

– competitor’s product

– alternate implementation

– custom oracle

Copyright © 2004-13, SQM, LLC.
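A minimal sketch of a parallel-generation comparison; both implementations here are invented stand-ins, with the second playing the alternate implementation the slide mentions:

# Deterministic evaluation: the same input goes to the SUT and to an
# alternate implementation; the two outcomes must match exactly.
def sut_upcase(s)                        # stand-in for the SUT
  s.upcase
end

def oracle_upcase(s)                     # alternate implementation
  s.chars.map { |c| ('a'..'z').include?(c) ? (c.ord - 32).chr : c }.join
end

100.times do
  input = Array.new(rand(20) + 1) { (rand(95) + 32).chr }.join  # printable ASCII
  a = sut_upcase(input)
  b = oracle_upcase(input)
  puts "Mismatch for #{input.inspect}: #{a.inspect} vs #{b.inspect}" if a != b
end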


Douglas Hoffman 115

Other Deterministic Evaluation

Comparison for outcome consistency

• Inverse function
– mathematical inverse
– operational inverse

• Useful functional rules
– Deterministic incidental or informative attributes

• Expected result embedded in the data
– self-descriptive (blue)
– embedded key (CRC, seed)
– cyclic data

Copyright © 2004-13, SQM, LLC.
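A short sketch of the inverse-function idea, using Ruby's standard Base64 library as the function pair; encoding followed by decoding must reproduce the input exactly:

# Inverse-function check: applying the inverse to the output
# must return the original input.
require 'base64'

1_000.times do
  original = Array.new(rand(64)) { rand(256).chr }.join   # random binary string
  decoded  = Base64.decode64(Base64.encode64(original))
  puts "Round trip failed for #{original.inspect}" if decoded != original
end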

Douglas Hoffman 116

Non-Deterministic Evaluation

Approximate comparisons or similarities

• Compare insufficient attributes

– use 16 bit functions to check 64 bit functions

– testable pattern for a set of values for a variable

– one or a few primary attributes of outcomes

• Statistical distributions

– test for outliers, means, predicted distribution

– statistical properties (predicted population statistics)

– comparison of correlated variables’ populations

Copyright © 2004-13, SQM, LLC.
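A tiny sketch of the statistical-distribution idea: no individual output is checked, only a population statistic against its predicted value (a uniform generator stands in for the SUT here):

# Non-deterministic evaluation: test the population mean, within a
# tolerance, instead of any single result.
samples = Array.new(100_000) { rand }              # SUT stand-in: uniform [0, 1)
mean = samples.reduce(:+) / samples.size
puts "Suspicious mean: #{mean}" if (mean - 0.5).abs > 0.01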


Douglas Hoffman 117

Non-Deterministic Evaluation

Approximate comparisons

• Approximate models
– At the end of the exercise, Z should be true
– X is usually greater than Y
– precondition = postcondition

• “Fuzzy” comparisons
– within a range of values
– bitmaps
– statistical analysis (likely properties)
– shading
– CRC-type summaries

Copyright © 2004-13, SQM, LLC.
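A one-screen sketch of the “within a range of values” flavor:

# Fuzzy comparison: the verdict is based on the actual value falling
# inside a tolerance band around the expected value, not on equality.
def fuzzy_match?(actual, expected, tolerance)
  (actual - expected).abs <= tolerance
end

puts fuzzy_match?(Math.sqrt(2) ** 2, 2.0, 1e-9)    # => true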

Douglas Hoffman 118

Non-Deterministic Evaluation

Similarities

• Incidental or informative attributes
– correlations (time of day and sales order number)
– relative duration (e.g., within a factor of x/3 to 3x)
– likely sequences
– size or shape (number of digits/characters)
– uniqueness of SSN

• Reordered sets (where order matters)
– itemized list of sales items (taxable and nontaxable)
– reorder asynchronous events for comparison

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 119

Software Test Automation:

Verdict Analysis

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 120

Basic Oracle Model

[Diagram: Test Inputs (Input Data, Program State, Environmental Inputs) feed the System Under Test, producing Test Results (Postcondition Data, Postcondition Program State, Environmental Results); the Test Oracle sees the inputs and results and issues the Verdict.]

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 121

‘Other Implementation’ Model

[Diagram: the same Test Inputs (Input Data, Program State, Environmental Inputs) are fed to both the System Under Test and the Test Oracle (another implementation); the SUT's Test Results (Postcondition Data, Postcondition Program State, Environmental Results) and the oracle's results go to a Compare step, which issues the Verdict.]

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 122

‘No’ Oracle Model

[Diagram: Test Inputs (Input Data, Program State, Environmental Inputs) feed the System Under Test, producing Test Results (Postcondition Data, Postcondition Program State, Environmental Results); with no oracle, the Compare step issues a Verdict based only on the results themselves (e.g., noticing crashes or hangs).]

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 123

‘Saved Master’ Model

[Diagram: Test Inputs (Input Data, Program State, Environmental Inputs) feed the System Under Test, producing Test Results (Postcondition Data, Postcondition Program State, Environmental Results); the results are compared against the Previous Results (the saved master) to issue the Verdict.]

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 124

‘Function Equivalence’ Model

[Diagram: the same Test Inputs (Input Data, Program State, Environmental Inputs) drive both the System Under Test and the Test Oracle (a functionally equivalent implementation); their Test Results (Postcondition Data, Postcondition Program State, Environmental Results) are checked for equivalence to issue the Verdict.]

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 125

‘Self-Verifying Data’ Oracle Model

[Diagram: Test Inputs (Input Data, Program State, Environmental Inputs) feed the System Under Test, producing Test Results (Postcondition Data, Postcondition Program State, Environmental Results); the Test Oracle checks the results against the expected values embedded in the data itself and issues the Verdict.]

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 126

‘Model Based’ Oracle Model

[Diagram: Test Inputs (Input Data, Program State, Environmental Inputs) drive both the System Under Test and a Model of the SUT inside the Test Oracle; the SUT's Test Results (Postcondition Data, Postcondition Program State, Environmental Results) are checked against the model's predictions to issue the Verdict.]

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 127

Software Test Automation:

Exploratory Automated Testing

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 128

High Volume Random Tests

• Principal idea
– High-volume testing using varied inputs
– Results checking based on individual results or the population’s statistical characteristics

• Fundamental goal is to have a huge number of tests
– The individual tests may not be all that powerful or compelling
– Input is varied for each step
– Individual results may not be checked for correctness – sometimes population statistics are used
– The power of the approach lies in the large number of tests

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 129

Low Volume Exploratory Tests

• Principal idea
– One-at-a-time testing using varied inputs
– Use automation to make exploration easy

• Fundamental goal is to enable exploration
– Variations on a theme (modify automated tests)
– Quick-and-dirty generation of tests/data/comparisons
– Background checking
• Memory leak detection
• File modification
• Etc.
– Command line or UI based variations

Copyright © 2004-13, SQM, LLC.

The Easy Part

I said before that driving the inputs or generating data is usually straightforward:
– Random number generators can allow us to create tests that do new things all the time
– Using well-formed expressions it is possible to create arbitrary inputs and data sets
– The oracles are fundamental to deciding test design and methodology

Douglas Hoffman Copyright © 2004-13, SQM, LLC. Slide 130


The Catch for the Easy Part

Although driving the inputs may be straightforward, selecting the interface points is more difficult:
– Requires understanding of the architecture and protocols to identify touch points
– Identifying the value of the touch points is critical
– Checking for unexpected behavior is the key to success

Douglas Hoffman Copyright © 2004-13, SQM, LLC. Slide 131

Douglas Hoffman 132

Random Number Generators

• Randomized input values

• Randomized data generation

• Random test selection

• Pseudo-Random numbers

• Using seed values

• Generating random seeds

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 133

Random Number Seeds

Computer random numbers aren’t really random – they’re pseudo-random:
– A long series of statistically random values that repeats
– By default it begins randomly somewhere in the series
– A seed value specifies where to begin in the series so the series repeats

Copyright © 2004-13, SQM, LLC.

Repeatable Random Series

# RUBY code
MAX_SEED = 1_000_000_000

def initial_RNG_seed(myseed)
  if myseed.nil?                 # check whether a seed was provided
    # Create a random number to seed the RNG
    puts "(no seed passed in, so generate one)"
    srand()                      # seed the RNG unpredictably
    myseed = rand(MAX_SEED)      # pick the seed we will actually use
  end
  # Print the seed so that we know (and can reuse) the seed
  puts "myseed is #{myseed}\n"
  srand(myseed)                  # initialize the RNG with the known seed
  rand()                         # generate and return the first random number
end

Douglas Hoffman 134 Copyright © 2012-13, SQM, LLC.


Random Series Output Example

puts ("First run: #{initial_RNG_seed(nil)} \n \n")
puts ("Second run: #{initial_RNG_seed(400)} \n \n")
puts ("Third run: #{initial_RNG_seed(nil)} \n")

(no seed passed in, so generate one)
myseed is 144288918
First run: 0.3705579466087263

myseed is 400
Second run: 0.6687289088341747

(no seed passed in, so generate one)
myseed is 108495905
Third run: 0.09838898989988143

Douglas Hoffman Copyright © 2012-13, SQM, LLC. 135

Douglas Hoffman 136

Well-Formed Values

• Input values and data have well-defined grammars (they’re formatted)
• Generating arbitrary values is a matter of applying the grammar
• Seed values for the random number generator allow us to repeat the random series and regenerate the inputs

Copyright © 2004-13, SQM, LLC.


Well-Formed Data

# RUBY program that generates simple arithmetic phrases
def eq_gen
  jubilee = 1 + rand(8)   # 8 choices for the 4 defined cases
  case jubilee
  when 1
    "(" << eq_gen() << ") + (" << eq_gen() << ")"
  when 2
    "(" << eq_gen() << ") - (" << eq_gen() << ")"
  when 3
    "(" << eq_gen() << ") * (" << eq_gen() << ")"
  when 4
    "(" << eq_gen() << ") / (" << eq_gen() << ")"
  else                    # generate a plain number half the time
    rand(100).to_s
  end
end

Douglas Hoffman 137 Copyright © 2012-13, SQM, LLC.

Example of Well-Formed Data

puts eq_gen.to_s
puts eq_gen.to_s
puts eq_gen.to_s
puts eq_gen.to_s

(77) - ((62) / (6))
((10) - (40)) + (67)
53
(62) - ((96) * ((((77) - (72)) - ((7) * ((47) - (91)))) / ((34) + (((70) - (18)) + (4)))))

Douglas Hoffman 138 Copyright © 2012-13, SQM, LLC.


Douglas Hoffman Copyright © 2012-13, SQM, LLC. 139

“Sandboxed” Tests

A sandboxed test gets along with itself:
• Self-contained regression test
• Independent of other tests
• Can be automatically launched
• Runs to completion
• Ends without resetting the SUT or system
• Can successfully be run repeatedly (e.g., 200x)

They’re used by randomly selecting and running tests from among a set of them

Random Sandboxed Tests

# RUBY program that runs executables in separate threads
# The numeric display routine courtesy of Nawwar Kabbani
threads = []
list = Dir.entries("./executables")      # directory with the tests
list3 = list.reject { |s| s =~ /^\./ }   # exclude . and ..
TestCount = list3.length

for y in (1..10)                         # launch 10 random executions from the files in the directory
  name = list3[rand(TestCount)]          # randomly select the test
  threads << Thread.new(name) do |x|     # generate a new thread with the selected test
    puts "\n running: #{x}"
    system("ruby ./executables/" + x)    # run the program (ruby)
  end
end
threads.each { |thr| thr.join }          # wait for all the threads

puts "\n Exiting"

Douglas Hoffman 140 Copyright © 2012-13, SQM, LLC.


Douglas Hoffman 141

Software Test Automation:

Examples of Oracles

Copyright © 2004-13, SQM, LLC.

‘No Oracle’ Example

Test functions through an API:

• Select a random function and use random

parameter values

• Run the test exercise

• Repeat without checking ‘results’

• Watch for hangs or crashes

Douglas Hoffman 142 Copyright © 2004-13, SQM, LLC.
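A minimal sketch of such a loop; the list of entry points is a stand-in (ordinary Integer methods, so the fragment runs), where a real harness would call the SUT's API:

# 'No oracle': call randomly chosen functions with random arguments,
# ignore the results, and only watch for crashes (exceptions).
api_functions = [:abs, :to_s, :succ, :pred]    # stand-in API

100_000.times do
  fn  = api_functions.sample                   # select a random function
  arg = rand(2**31) - 2**30                    # random parameter value
  begin
    arg.send(fn)                               # run the exercise; result not checked
  rescue StandardError => e
    puts "CRASH: #{fn} on #{arg}: #{e.class} #{e.message}"
  end
end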


‘No Oracle’ Example

Test random character input to a word processor:

• Generate a random input character

• Run the test exercise

• Repeat without checking results

• Watch for hangs or crashes

Douglas Hoffman 143 Copyright © 2004-13, SQM, LLC.

‘Independent Implementation’ Example

Create an independent sine(x) function oracle:
• Determine the algorithm used by the SUT
• Implement a separate sine(x) function using a different algorithm
• Generate random values for x
• The values returned from both sine(x) functions should match

Douglas Hoffman 144 Copyright © 2004-13, SQM, LLC.
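A sketch of this oracle, with Math.sin standing in for the SUT and a separately written Taylor-series sine as the independent implementation; matching is approximate, since the two algorithms round differently:

# Independent implementation of sine(x) used as an oracle.
def taylor_sin(x, terms = 12)
  x = x % (2 * Math::PI)                       # reduce the range first
  x -= 2 * Math::PI if x > Math::PI            # map into [-pi, pi]
  (0...terms).reduce(0.0) do |sum, n|
    numer = (-1)**n * x**(2 * n + 1)
    denom = (1..(2 * n + 1)).reduce(1, :*)     # (2n+1)!
    sum + numer / denom
  end
end

1_000.times do
  x = rand * 1000.0
  a = Math.sin(x)                              # stand-in for the SUT's sine
  b = taylor_sin(x)
  puts "sine mismatch at #{x}: #{a} vs #{b}" if (a - b).abs > 1.0e-9
end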


‘Consistency’ Oracle

There are two approaches for consistency testing:

1. “Golden Master”
– Saved results from an earlier run
– Validated or unvalidated

2. Function equivalence
– Competitive product
– Alternate implementation

Douglas Hoffman 145 Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 146

‘Golden Master’ Approach

• Checks for changes or differences

• Most frequently used strategy

• Most common method:

− Create a test with logging to a disc file

− Run the test and check the log carefully

− Keep the log as a [golden] master

− Compare subsequent runs against the log

− Update the master when legacy bugs are fixed

Copyright © 2004-13, SQM, LLC.
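A minimal sketch of that master/compare cycle; the file names are illustrative, and the first run's log must still be reviewed by hand before it is trusted as the master:

# Golden master: keep the first reviewed log, then diff later runs
# against it line by line.
require 'fileutils'

def check_against_master(log_file, master_file)
  unless File.exist?(master_file)
    FileUtils.cp(log_file, master_file)        # first run: keep the log
    puts "Master created from #{log_file}; review it carefully"
    return
  end
  master = File.readlines(master_file)
  log    = File.readlines(log_file)
  master.zip(log).each_with_index do |(m, l), i|
    puts "Line #{i + 1} differs: master #{m.inspect} vs run #{l.inspect}" if m != l
  end
  puts "Run has #{log.size - master.size} extra lines" if log.size > master.size
end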


Douglas Hoffman Copyright © 2012, SQM, LLC. 147

‘Function Equivalence’ Tests

Passing input values to two products for comparison in real time:
• Competitive product, alternate implementation, earlier version, different platforms, etc.
• Randomly generated, well-formed inputs (or not)
• Use the same inputs for both versions
• Test the outcomes for “equivalency”

‘Function Equivalence’ Example

Create a function equivalence test for spreadsheet math functions:
• Identify the functions of interest and their parameter definitions
• Create a test exercise that creates random, well-formed function calls with parameters
• Feed the calls to OOSpreadsheet and Excel
• Compare the returned results

Douglas Hoffman 148 Copyright © 2004-13, SQM, LLC.


‘Self-Descriptive’ SVD Examples

• Describe the expected result in the data itself (SVD: self-verifying data)
– The name of a font (e.g., Comic Sans MS)
– Color (e.g., Blue)
– Size (e.g., 36 point)
– The following line should be “xyz”

Douglas Hoffman 149 Copyright © 2004-13, SQM, LLC.

‘Cyclic Algorithm’ SVD Example

1. Use a pattern when the data is generated
– E.g., start, increment, count
– E.g., basic string, count of iterations
2. Identify the pattern in the result
3. Confirm that the actual pattern is expected
– Build into the comparator
– Embed with the data

Douglas Hoffman 150 Copyright © 2004-13, SQM, LLC.
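A small sketch of the (start, increment, count) flavor: the generator embeds the pattern, and the oracle re-derives it from the result to confirm nothing was dropped or corrupted:

# Cyclic-algorithm SVD: data generated from a pattern can be checked
# against that same pattern later.
def generate(start, increment, count)
  (0...count).map { |i| start + i * increment }
end

def pattern_ok?(data)
  return true if data.size < 2
  inc = data[1] - data[0]                       # recover the increment
  data.each_cons(2).all? { |a, b| b - a == inc }
end

puts pattern_ok?(generate(10, 3, 100))          # => true
puts pattern_ok?(generate(10, 3, 100) << 999)   # => false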


‘Tagged Data’ SVD Example

• Generate a value that summarizes the data, such as a cyclic redundancy check:
1. Use the first value as the starting number for the CRC
2. Shift the CRC one place
3. Add the next value
4. Repeat steps 2 and 3 until all values have been included in the CRC computation
5. Append the computed CRC to the data

Douglas Hoffman 151 Copyright © 2004-13, SQM, LLC.
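A sketch of those five steps as a shift-and-add summary in the spirit of a CRC (not a true polynomial CRC); the 32-bit mask is an illustrative choice:

# Tagged-data SVD: append a check value computed from the data, then
# recompute it on read-back to detect corruption.
def check_value(values)
  crc = values.first
  values.drop(1).each do |v|
    crc = ((crc << 1) & 0xFFFFFFFF) + v          # shift one place, add next value
  end
  crc & 0xFFFFFFFF
end

data   = Array.new(20) { rand(256) }
tagged = data + [check_value(data)]              # step 5: append the CRC

# Later, the oracle verifies the tag:
payload, tag = tagged[0..-2], tagged[-1]
puts "Data corrupted" unless check_value(payload) == tag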

Douglas Hoffman 152

‘Embedded Key’ SVD Example

• For a 128 character name field:
− Use a seeded RNG to generate the name (up to 120 characters)
− Convert the seed to an 8 character value
− Append the seed to the name
− Regenerate the name using the seed and algorithm

Name(128) = [L random characters][8 character seed S], 9 to 128 characters long

Copyright © 2004-13, SQM, LLC.
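A runnable sketch of the scheme; the field widths follow the slide, while the letter generator and the 8-digit decimal seed encoding are illustrative choices:

# Embedded-key SVD: the tail of the field carries the seed, so the
# oracle can regenerate the name and compare.
def name_from_seed(seed)
  srand(seed)                                          # seed the RNG
  len  = rand(120) + 1                                 # 1..120 characters
  body = Array.new(len) { (rand(26) + 65).chr }.join   # random letters
  body + format('%08d', seed)                          # append 8-character seed
end

def name_ok?(field)
  seed = field[-8..-1].to_i                            # recover the embedded seed
  name_from_seed(seed) == field                        # regenerate and compare
end

field = name_from_seed(rand(100_000_000))
puts name_ok?(field)                                   # => true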


‘Shared Keys’ SVD Example

Create a random name:
1. Generate and save a random number seed (S) and convert it to a string
2. Use the first random value from RAND(S) as the length (L)
3. Generate a random name (N) with L characters using RAND()
4. Concatenate the seed to the name

Douglas Hoffman 153 Copyright © 2004-13, SQM, LLC.

‘Model-Based’ Oracle Example

• For a state machine (such as web page transitions)
– Describe the state transition table in machine-readable form
– Design a test exercise applying the table (e.g., walk all transitions, random walk, or generate illegal events)
– Verify transitions within the test by sensing the state

Douglas Hoffman 154 Copyright © 2004-13, SQM, LLC.
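A compact sketch of a random walk over a machine-readable transition table; the SUT hooks are faked so the fragment runs, where a real harness would drive and sense the actual application:

# Model-based oracle: after every event, the sensed SUT state must
# match the state the model predicts.
TRANSITIONS = {
  [:logged_out, :login]  => :home,
  [:home,       :logout] => :logged_out,
  [:home,       :search] => :results,
  [:results,    :back]   => :home
}

$sut_state = :logged_out                     # stand-in SUT state
def sut_fire(event)                          # stand-in: a SUT that happens to be correct
  $sut_state = TRANSITIONS[[$sut_state, event]]
end

model_state = :logged_out
1_000.times do
  events = TRANSITIONS.keys.select { |s, _| s == model_state }.map { |_, e| e }
  event  = events.sample                     # random legal event
  expected = TRANSITIONS[[model_state, event]]
  sut_fire(event)
  puts "Expected #{expected}, saw #{$sut_state}" if $sut_state != expected
  model_state = expected
end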


‘Constraint-Based’ Oracle Examples

Employee data:
• Employee birth date is before start date
• Each employee has a unique SSN
• Active employee start dates are before today
• At most one current spouse
• Only one record for an employee

Douglas Hoffman 155 Copyright © 2004-13, SQM, LLC.
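A sketch that turns most of the rules above into executable checks; the record field names are invented for the example:

# Constraint-based oracle: scan the records for violations.
require 'date'

def violations(employees)
  v = []
  ssns = employees.map { |e| e[:ssn] }
  v << "duplicate SSNs" if ssns.uniq.size != ssns.size
  employees.each do |e|
    v << "#{e[:id]}: birth date not before start date" if e[:birth_date] >= e[:start_date]
    v << "#{e[:id]}: active but starts in the future"  if e[:active] && e[:start_date] > Date.today
    v << "#{e[:id]}: more than one current spouse"     if e[:current_spouses].to_i > 1
  end
  v
end

emps = [{ id: 1, ssn: "123-45-6789", birth_date: Date.new(1980, 1, 1),
          start_date: Date.new(2005, 6, 1), active: true, current_spouses: 1 }]
puts violations(emps).inspect                 # => [] (no violations)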

‘Probabilistic’ Oracle Examples

Employee database integrity check:

1. Employee older than dependents

2. Employee start date after company established

3. Employee age greater than 15

4. Zip code (if USA) 5 or 9 digits

5. USA SSN has 9 digits

6. No duplicate SSN entries

Douglas Hoffman 156 Copyright © 2004-13, SQM, LLC.


‘Statistical’ Oracle Example

Computing the right state taxes:
• Generate random items, quantities, locations, etc. for purchase transactions
• Track the total amounts of purchases and taxes collected
• Compute the average tax rate across all locations
• (total taxes collected) should equal (total purchases * average tax rate)

Douglas Hoffman 157 Copyright © 2004-13, SQM, LLC.
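A sketch of that plausibility check; the rates and amounts are invented, and since the identity only holds approximately, the verdict uses a tolerance band:

# Statistical oracle: check collected totals, not individual transactions.
rates = { ca: 0.0725, ny: 0.04, tx: 0.0625 }        # illustrative tax rates
txns  = Array.new(10_000) do
  { amount: rand(10_000) / 100.0, loc: rates.keys.sample }
end

total_purchases = txns.reduce(0.0) { |s, t| s + t[:amount] }
total_tax       = txns.reduce(0.0) { |s, t| s + t[:amount] * rates[t[:loc]] }
average_rate    = txns.reduce(0.0) { |s, t| s + rates[t[:loc]] } / txns.size

expected = total_purchases * average_rate
puts "Implausible tax total: #{total_tax} vs #{expected}" if (total_tax - expected).abs > 0.05 * expected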

‘Property-Based’ Oracle Example

Sales Order Number and Date/Time Stamp:
• Run an SQL query for all records sorted by Sales Order Number
• Run an SQL query for all records sorted by Date/Time of Creation
• The two sets of records should be the same

Douglas Hoffman 158 Copyright © 2004-13, SQM, LLC.
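A sketch of the same property without the database: if order numbers are issued in creation order, the two sort keys must produce the same sequence (the records here are synthesized):

# Property-based oracle: two independent orderings should agree.
records = Array.new(1_000) { |i| { order_no: i + 1, created_at: Time.at(1_000_000 + i) } }
records = records.shuffle                       # simulate arbitrary storage order

by_number = records.sort_by { |r| r[:order_no] }
by_time   = records.sort_by { |r| r[:created_at] }
puts "Property violated: orderings differ" unless by_number == by_time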


‘Computational’ Oracle Example

Computing the square root:
• Generate a random input value (x)
• Compute y = square root (x)
• Compute z = y²
• Check that x = z (within floating-point tolerance)

Douglas Hoffman 159 Copyright © 2004-13, SQM, LLC.
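The same steps in Ruby, with the equality hedged to a relative tolerance because y is a rounded floating-point value:

# Computational oracle: apply the inverse computation and compare.
10_000.times do
  x = rand * 1.0e6                 # random input value
  y = Math.sqrt(x)
  z = y * y
  puts "sqrt check failed for #{x}: got #{z}" if (x - z).abs > x * 1.0e-12
end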

‘Computational’ Oracle Example

Splitting a table:
• Generate and populate a table
• Select some row in the table
• Split the table at that row
• The oracle removes the split, rejoining the two tables
• Check that the original and final tables are the same

Douglas Hoffman 160 Copyright © 2004-13, SQM, LLC.


‘Diagnostic’ Oracle Example

Long random walk:
• Put assert statements in the code for invalid or unusual circumstances
• Create records of significant data for diagnostic purposes when an assertion is violated
• Generate a large number of valid inputs to exercise the code

Douglas Hoffman 161 Copyright © 2004-13, SQM, LLC.

‘Diagnostic’ Oracle Example

Long random walk:
• Put logging statements in the code to record internal state and value information
• Run the exploratory test
• Evaluate the logs after testing for unexpected circumstances

Douglas Hoffman 162 Copyright © 2004-13, SQM, LLC.


‘Hand-Crafted’ Oracle Example

Sales order processing:
• Choose specific values for order information (e.g., boundary cases)
• Identify what the order should look like and how the SUT should react
• Verify actual vs. expected when the test is run

Douglas Hoffman 163 Copyright © 2004-13, SQM, LLC.

‘Human’ Oracle Example

Calendar generation program testing:
• Choose many sets of input values describing a calendar
• Print the input values and the corresponding generated calendar
• Review the calendar to confirm appropriate calendar or error indication output

Douglas Hoffman 164 Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 165

Software Test Automation:
Some Parting Words about the Design of Automated Tests

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 166

Exploratory Test Automation

• Extends our reach

• Augments human capabilities

• Does something different every time

• Is heavily dependent on oracles

• May access internal information

• Can surface bugs that we didn’t consider

• May check invariants

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 167

Good Automated Tests

• Start with a known state
• Build variation into the tests
• Plan for the capture of data on error
• Check for errors during the test run
• Capture information when an error is noticed
• Minimize error masking and cascading

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 168

Start With a Known State

• Data
– Load preset values in advance of testing
– Reduce dependencies on other tests
• Program State
– External view
– Internal state variables
• Environment
– Decide on the desired controlled configuration
– Capture relevant session information

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 169

Build Variation Into the Tests

• Dumb monkeys

• Data driven tests

• Pseudo-random event generation

• Model driven and model based automation

• Variations on a theme

• Configuration variables

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 170

Plan For Capture of Data

• Know what data may be important to identify and fix errors
• When necessary, capture prospective diagnostic information before an error is detected
• Include information beyond inputs and outcomes
• Check as many things as possible
• Design tests to minimize changes after errors occur

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 171

Check for Errors

• Periodically check for errors as the test runs
• Document expectations in the tests
• Capture prospective diagnostic information before an error is detected
• Capture information when the error is found (don’t wait)
– Results
– Other domains
– “Dump the world”
• Check as many things as possible

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 172

Capture Information When An Error is Noticed

• When something unexpected is detected: “dump the world”
– Capture all “interesting” information
– Let the users of the information determine what’s interesting
– Since the developers are typically the information users, enlist their help to automate information gathering

OR

• Freeze – wait for a person
– When “interesting” information is expensive to capture
– When there isn’t time to create automated capture routines

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman 173

Don’t Encourage Error Masking or Error Cascading

• A session runs a series of tests
• A test does not run to normal completion
– Error masking occurs if testing stops
– Error cascading occurs if one or more downstream tests fail as a consequence of this test failing
• Impossible to avoid altogether
• Should not design automated tests that unnecessarily cause either situation

Copyright © 2004-13, SQM, LLC.

Douglas Hoffman 174

Recapping

• Use automation to explore (not just repeat)
• Extend your reach into the system
• Think of models for test execution
• Automate based on available oracles
• Use different types of oracles
• Design good automated tests

Copyright © 2004-13, SQM, LLC.


Douglas Hoffman Copyright © 2004-13, SQM, LLC. Slide 175