Integrating Testing into the Curriculum – Arsenic in Small Doses

Edward L. Jones
CIS Department
Florida A&M University, Tallahassee, FL USA

SIGCSE 2001, 21-25 Feb 2001


Lessons from ACE2000

• Practice (10 slides)

• SPRAE framework, carefully explained

• Arsenic pills:

– 7 things to do

– Automated grading

– Student mentoring

• Tester certification in the experience factory

• On-going work


Motivation

• Industry need

• Testing experience adds value

• Avoid the isolationist view – confining testing to a single course

• Salt courses with test-related experiences

• Provide opportunity for advanced study

• Impact our curriculum


Possible Approaches

• Teach a course in software testing

• Expose student work to rigorous testing

• Train selected students

• Opportunistic insertion of testing into existing courses

• Formal training environment


The Holistic Approach

[Diagram] Components: Software Test Lab, Core Curriculum, Elective Testing Course, Testing in Action (Automated Grading), and the unifying SPRAE Testing Framework.


What is Meant by Holistic

• Unifying framework – what’s important

• Experience-based

• Experiences aligned to framework

• Multiple experiences in different contexts

• Multiple means of delivery


The SPRAE Framework

S: Specification – the basis for testing

P: Premeditation – a systematic process

R: Repeatability – tester independence

A: Accountability – documented process, results

E: Economy – cost-effective practices


Test Life Cycle

[Diagram] Phases and their products, starting from the Specification:

Analysis → Test Plan
Design → Test Cases
Implementation → Test Script, Data, Driver
Execution → Test Results, Test Log
Evaluation → Defect Data, Problem Reports


The Software Testing Course

• 80% practice, 20% theory

• 2/3 fine-grained testing (functions)

• 1/3 object and application testing

• Test cases, drivers, and scripts

• Decision tables are the "formalism" of choice

• Functional, boundary, white-box testing

• Evaluation via coverage & error seeding


Course -- Lessons Learned

• Advantages
  – In-depth, continuous concept treatment
  – Complement to other software skills
  – Basis for future advanced work

• Deficiencies
  – Why Johnny can't test? Programming skills
  – Not a mainstream course available to all
  – Students compartmentalize course content


Testing in Action – Automated Program Grading

[Diagram] Prepare Assignment → Implement Grader → Grade Programs. The instructor prepares the assignment specification and implements the automated grader (test plan, test cases, test driver, checker script – the grading overhead); the student writes a program against the same assignment specification; grading the student program produces a grading log and a grade report.
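The checker script itself is not shown in the deck; below is a minimal, hypothetical sketch of the idea in C++: compare the student program's captured output against the instructor's expected-results file line by line and report mismatches. The file names, arguments, and report format are assumptions for illustration, not the author's actual grader.

    // Hypothetical checker sketch: diff a student's output against expected results.
    #include <fstream>
    #include <iostream>
    #include <string>

    int main(int argc, char* argv[]) {
        if (argc != 3) {
            std::cerr << "usage: checker <student_output> <expected_output>\n";
            return 2;
        }
        std::ifstream actual(argv[1]);
        std::ifstream expected(argv[2]);
        std::string a, e;
        int line = 0, failures = 0;
        while (std::getline(expected, e)) {            // walk the expected file
            ++line;
            bool haveActual = static_cast<bool>(std::getline(actual, a));
            if (!haveActual || a != e) {               // missing or mismatched line
                std::cout << "FAIL line " << line << ": expected \"" << e
                          << "\", got \"" << (haveActual ? a : "<missing>") << "\"\n";
                ++failures;
            }
        }
        std::cout << (failures == 0 ? "PASS" : "FAIL")
                  << " (" << failures << " mismatches)\n";
        return failures == 0 ? 0 : 1;
    }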


Results -- Automated Program Grading

• Student shock & outrage at exactness

• Behavior modification – tester mindset

• Extra work on teacher
  – Specification must be better than average!
  – Practice what you preach (test process)
  – Cost amortized via similar assignment styles

• Is grader too strict for CS1/CS2?

• Selling colleagues on the idea!


Opportunistic Insertion into Existing Courses

• Risk of diluting course content

• Hard to transfer to colleagues (comfort zone with subject matter)

• Value-added approach
  – Testing strengthens other skills
  – Testing brings objectivity to the student

• Some examples


Simple Things to Try

• Grade another student’s program and justify grade -- in writing.

• (Group) Develop test cases before writing program.

• Treasure Hunt. Find seeded errors, document process followed, give evidence of fix.

• Develop and sell certified components to be used in subsequent assignments.

• Blind testing. Write specification from executable.


Mentoring Students - Experiences

• Hand-picked students -- paid

• Series of testing projects

• Skills taught as needed

• Student access to examples from courses, other students

• MANAGEMENT NIGHTMARE!

• Structured environment and process needed!


The Software TestLab

• Environment for discovery & learning

• An evolving laboratory

• Tools & Tutorials

• Staff (students, faculty)

• Test problem/solution test bed

• Students participate in the evolution

• Feedback on lab resources

• Create/Refine resources

• Technology insertion into classroom

Vision


TestLab -- The Big Picture

[Diagram] The Software TestLab at the center, linked to Curriculum, Students, Research/Publications, and Corporate Sponsors (support); outputs include marketing, interns, and new hires.


Student Mentorship Model

• Manage skill development

• Set clear achievement goals

• Key Practices x Competency Levels

• Certify levels of progression

• Enable student-student mentoring

• Establish recognition program

Mentorship


Key Practices

• Practitioner - performs defined test.

• Builder - constructs test “machinery”

• Designer - designs test cases.

• Analyst - determines test needs, strategy.

• Inspector - verifies correct process, results.

• Environmentalist - establishes & maintains the test environment.

• Specialist - performs entire test life cycle.

Mentorship


Competency Levels

[Matrix] Key practices (rows) crossed with competency levels 1-5 (columns): Test Practitioner, Test Builder, Test Designer, Test Analyst, Test Inspector, Test Environmentalist, and Test SPECIALIST are each certified at levels 1 through 5.

Mentorship


Test Specialist I

• Practitioner I - Run function test script & document test results.

• Builder I - Develop function test driver.

• Designer I - Develop functional and boundary test cases from decision table.

• Analyst I - Develop decision table from functional specification.

• Inspector I - Verify adherence to function test process and standards.

Mentorship


Certification Testbed

• Repository of testing problems

• Repository of students’ test artifacts

• Best in class promoted to solutions testbed

• Deficient solutions used for future tester certification

Infrastructure


Conclusions

• Testing must compete for air time with existing subject matter

• Opportunities to insert testing exist

• Testing can bring added value to course

• Need rigorous study of value-added hypothesis

• Biggest job may be selling colleagues


On-Going & Future Work

• Evolve TestLab Mentorship Model
  – Experience Ladder & Certification
  – Evolving Problem/Solution Artifacts

• Careful study of value-added hypothesis

• Exploit automated grading – student mindset

• Disseminate Results


Questions?



Thank You


Training Sequence (1)

• TestLab Environment Basic Training

• Unix basics

• C++/language refresher

• Encapsulation of function under test

• Repositories

Infrastructure


Training Sequence (2)

• Black-Box Function (unit) Testing

• Specification

• Stimulus-response test cases

• Function test driver (5 styles)

• Test driver input/results files

• Test script (set-up + perform)

• Test log

• Decision tables

• Functional (partition) test case design

• Boundary test case design

Infrastructure
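As one illustration of the "function test driver" and "input/results files" items above, here is a sketch of a file-driven driver for ComputePay, the function specified in the Example - Pay backup slides later in the deck. The driver style, the file names, and the input line format (test-id, hours, rate, expected pay) are assumptions for illustration; the deck names five driver styles without fixing one.

    // Sketch: file-driven function test driver for ComputePay (function under test).
    // Each input line: test-id hours rate expected-pay. Links against ComputePay.
    #include <cmath>
    #include <fstream>

    double ComputePay(double hours, double rate);      // function under test

    int main() {
        std::ifstream in("pay_tests.txt");             // test driver input file (hypothetical name)
        std::ofstream results("pay_results.txt");      // test results file
        int id;
        double hours, rate, expected;
        while (in >> id >> hours >> rate >> expected) {
            double actual = ComputePay(hours, rate);
            bool pass = std::fabs(actual - expected) < 0.005;
            results << id << ' ' << expected << ' ' << actual
                    << ' ' << (pass ? 'P' : 'F') << '\n';   // one log line per test case
        }
        return 0;
    }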


Training Sequence (3)

• Black-Box Object Testing

• Specification of object's methods

• Analysis of method’s stimulus-response

• Test planning

• Test cases = method + stimulus + response

• Object Test driver

• Object Test driver input/results files

• Operational test scenarios

Infrastructure
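A small sketch of what an object test driver can look like, using std::stack as a stand-in for the object under test (the deck does not prescribe a particular class). Each step is a test case of the form method + stimulus + response, checked through the public interface only.

    // Sketch: black-box object test driver; std::stack stands in for the object under test.
    #include <iostream>
    #include <stack>
    #include <string>

    int main() {
        std::stack<int> s;                             // object under test (stand-in)
        int step = 0;
        auto log = [&](const std::string& testCase, bool pass) {
            std::cout << "Step " << ++step << "  " << testCase
                      << "  " << (pass ? 'P' : 'F') << "\n";
        };
        log("empty() on a new object -> true", s.empty());
        s.push(7);                                     // stimulus
        log("size() after push(7) -> 1", s.size() == 1);   // response
        log("top() after push(7) -> 7", s.top() == 7);
        s.pop();                                       // operational scenario continues
        log("empty() after pop() -> true", s.empty());
        return 0;
    }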


Training Sequence (4)

• White-Box Function Testing

• Control flow graph basics

• Coverage criteria/measures

• Instrumentation for data collection

• Use of in-house coverage tools

• Coverage analysis

• White-box coverage during black-box test

• Supplemental white-box test cases

Infrastructure
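The in-house coverage tools are not shown in the deck; the sketch below only illustrates the underlying idea of instrumentation for data collection: plant probes on each branch of the function under test (the Pay computation again), replay the black-box tests, and report which branches were exercised.

    // Sketch: hand-instrumented branch coverage for the Pay computation.
    #include <iostream>

    static int branchHits[2] = {0, 0};                 // probes: [0] overtime branch, [1] regular branch

    double ComputePayInstrumented(double hours, double rate) {
        if (hours > 40.0) {
            ++branchHits[0];                           // overtime branch taken
            return 40.0 * rate + 1.5 * rate * (hours - 40.0);
        }
        ++branchHits[1];                               // no-overtime branch taken
        return hours * rate;
    }

    int main() {
        ComputePayInstrumented(40.0, 10.0);            // replay black-box test cases
        ComputePayInstrumented(50.0, 10.0);
        std::cout << "overtime branch: " << branchHits[0]
                  << ", regular branch: " << branchHits[1] << "\n";
        // A zero count would call for supplemental white-box test cases.
        return 0;
    }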


Training Sequence (5)

• Clear-Box Object Testing

• Goal is to overcome information hiding

• Test windows into internal object state

• set_state ( … )

• get_state ( … )

• Test case
  – Stimulus: precondition + method-stimulus
  – Response: method-response + postcondition

• Clear-box object test driver

• Clear-box test oracles

Infrastructure
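A sketch of the set_state/get_state "test window" idea, using a hypothetical BankAccount class that is not from the deck. The test case pairs a precondition + method stimulus with a method response + postcondition, both handled through the windows into internal object state.

    // Sketch: clear-box object test via set_state/get_state test windows.
    #include <iostream>

    class BankAccount {
        double balance = 0.0;                          // hidden internal state
    public:
        bool withdraw(double amount) {                 // method under test
            if (amount <= 0.0 || amount > balance) return false;
            balance -= amount;
            return true;
        }
        void set_state(double b) { balance = b; }      // test window: establish precondition
        double get_state() const { return balance; }   // test window: observe postcondition
    };

    int main() {
        BankAccount acct;
        acct.set_state(100.0);                         // precondition: balance = 100
        bool response = acct.withdraw(30.0);           // method stimulus
        bool pass = response && acct.get_state() == 70.0;  // response + postcondition
        std::cout << "withdraw(30) with balance 100: " << (pass ? 'P' : 'F') << "\n";
        return 0;
    }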


Training Sequence

• TestLab Environment Basic Training

• Black-Box Function (unit) Testing

• Black-Box Object Testing

• White-Box Function Testing

• Clear-Box Object Testing

Infrastructure


TestLab Infrastructure

• SPRAE Framework / Lifecycle

• Software Testing Course

• Training Sequence

• Standards & Techniques

• Student Mentorship Model

• Problem & Solution Testbeds

Status


Standard Products

• Specification (narrative, semiformal)

• Decision Table

• Test Plan

• Test Script

• Test Driver

• Test Driver Input File

• Test Results File

• Test Log

Infrastructure


Techniques

• Functional Equivalence Partitioning

• Boundary Value Analysis

• Function Encapsulation

• Control Flow Analysis

• Error-seeding for tester certification

Infrastructure
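To make the last technique concrete, a small illustration of error seeding using the Pay example from the backup slides: one fault is planted in an otherwise correct implementation, and a candidate tester's test set is judged by whether it exposes the seed. The specific fault shown is an assumption for illustration, not a prescribed seed.

    // Illustration: a seeded fault in the Pay computation (overtime factor 0.5 instead of 1.5).
    double ComputePaySeeded(double hours, double rate) {
        if (hours > 40.0)
            return 40.0 * rate + 0.5 * rate * (hours - 40.0);   // seeded defect
        return hours * rate;
    }
    // A functional test case from the Pay decision table (Hours=50, Rate=10,
    // expected Pay=550) yields 450 here, so a sound test set catches the seed.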


What Makes It Holistic?

• Testing an integral part of curriculum

• Goal is multiple test experiences

• At least one experience in each course

• Repetition and reinforcement

• Accumulation of experiences

• Coverage of test lifecycle


Course Outline

1 Course Overview
2 Software Quality Assurance
3 The Practice of Testing
4 Specification-Driven Testing
5 Boundary Testing
6 Measuring Test Effectiveness
7 Testing Object-Oriented Software
8 Application Testing
9 Course Review & Wrap-Up


Automation Issues

• Does the teacher have the time?

• Is grader too strict for CS1/CS2?

• Additional automation to lower cost.

• The trap: “just a little more automation”

• Selling colleagues on the idea!


Outline

• The Holistic Approach

• The SPRAE Testing Framework

• The Software Testing Course

• Automated Program Grading

• Opportunistic Insertion

• The Software Test Lab

• Conclusions / Future Work


Example - Pay (S)

Specification: Compute pay for an employee, given Hours worked and hourly pay Rate; overtime is 1.5 times hourly Rate, for Hours above 40.

[Diagram] Inputs: Hours, Rate → ComputePay → Output: Pay
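As a point of reference for the test examples that follow, a direct C++ rendering of this specification (the deck gives only the specification, not code):

    // ComputePay per the specification: overtime at 1.5 * Rate for Hours above 40.
    double ComputePay(double hours, double rate) {
        if (hours > 40.0)
            return 40.0 * rate + 1.5 * rate * (hours - 40.0);
        return hours * rate;
    }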


Principle P

Premeditation: Use a systematic process to devise test cases based on the specification.

Our Technique:
• Decision analysis -- identify behaviors
• One test case per behavior
• Determine expected result


Example - Pay

Test Case Design: Decision Table. Columns identify behaviors to test.

Decision Table (columns identify behaviors to test):

                                                       Behavior 1   Behavior 2
  Condition: Hours > 40                                    N            Y
  Action: Pay = Hours * Rate                               X
  Action: Pay = 40 * Rate + 1.5 * Rate * (Hours - 40)                   X

Test Cases (one per behavior):

  Test Case   Hours   Rate   Expected Pay
  1           40      10     400
  2           50      10     550
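A minimal sketch of a driver that executes the two test cases above against ComputePay; the in-code test table mirrors the decision table. This is one possible driver style for illustration, not the course's prescribed one.

    // Sketch: table-driven driver for the two decision-table test cases.
    #include <cmath>
    #include <iostream>

    double ComputePay(double hours, double rate);      // function under test (see sketch above)

    struct TestCase { int id; double hours, rate, expectedPay; };

    int main() {
        const TestCase cases[] = {
            {1, 40.0, 10.0, 400.0},                    // behavior: Hours <= 40
            {2, 50.0, 10.0, 550.0},                    // behavior: Hours > 40 (overtime)
        };
        for (const TestCase& tc : cases) {
            double pay = ComputePay(tc.hours, tc.rate);
            bool pass = std::fabs(pay - tc.expectedPay) < 0.005;
            std::cout << "Test " << tc.id << ": expected " << tc.expectedPay
                      << ", got " << pay << " -> " << (pass ? 'P' : 'F') << "\n";
        }
        return 0;
    }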


Principle R

Repeatability: Processes for test case creation and test execution must yield equivalent results, independently of the tester.


Principle A

Accountability: Keep records that document test process and artifacts.

Documentation answers:
• What tests were planned?
• Which tests were conducted?
• Who did what testing, and when?
• What were the results?
• How were the results interpreted?


Example - Pay

Repeatability/Accountability: Test Script and Test Log

  Test Script (user action: enter Rate, Hours)   Expected   Test Outcomes
  Step   Rate   Hours                            Pay        Run 01   Run 02
  1      10     40                               400        P        P
  2      10     50                               550        F        P

Error/Discrepancy Log:

  Test   Step   Description
  01     2      Result Pay=500. OT not calculated.


Principle E

Economy: Test activities must not require excessive time or effort.

• Automation
  – Test drivers (classical tool)

• Simplified processes for
  – Test case generation
  – Data collection