Ingegneria del Software II


Lecturers: Henry Muccini and Vittorio Cortellessa

Computer Science Department, University of L'Aquila, Italy

muccini@di.univaq.it – cortelle@di.univaq.it

[www.di.univaq.it/muccini] – [www.di.univaq.it/cortelle]

Course:

Ingegneria del Software II, academic year 2004-2005

Course Web-site: [www.di.univaq.it/ingegneria2/]

12. Regression Testing


© 2005 by H. Muccini and V. Cortellessa / Ingegneria del Software IISEA Group

Copyright Notice

» The material in these slides may be freely reproduced and distributed, partially or totally, as long as an explicit reference or acknowledgment to the material's author is preserved.

Henry Muccini


Acknowledgment

» This is joint work with the University of California, Irvine (Debra J. Richardson and M. Dias)

» Thanks to Lihua Xu and to the ROSATEA group at the University of California, Irvine for their contribution to this research


Agenda

» Definition and Techniques

» Regression Test Selection Techniques

» Spec-based Regression Testing approaches

» SA-based Regression Testing


What is Regression Testing

» Maintenance activity

» Performed on modified programs

» Attempts to validate modified software and ensure that modifications are correct


» Test modified software and provide a certain confidence that no new errors are introduced into previously tested code.

» Given a program P and its modified version P’, let T be a test suite for P. A regression testing technique:

- provides a certain confidence that P’ is still correct with respect to a set of tests T’, which is a subset of T;

- helps to identify new test cases for T’.

» Different Techniques:

- Retest all

- Selective Retest (Regression Test Selection)

Regression Testing


» Select T’, subset of T and relevant for P’;

» Test P’ with respect to T’;

» If necessary, create T’’, to test new functionality/structure in P’;

» Test P’ with respect to T’’;

» Create T’’’, a new test suite and test history;

Regression Test Selection technique


Traditional (Selective) Regression Testing

» Traditional Regression:

- We have a program P

- We select a set of tests T to be applied to P

- We test P with T

- P is modified in P’

- Select T’ in T, a set of tests to execute on P’

- Test P’ with T’

- If necessary, create T’’, a set of new tests for P’ and test P’ with T’’

[Figure: program P, composed of XClient and clients ClientA, ClientB, ClientC, tested with a set of tests T; the modified program P’ (XClient v2, ClientA v2, ClientB v2, ClientC v2) is tested with T’, a subset of T]


Selective Retest

1. Identify the modifications made to P to obtain P’.

2. Select T’, a subset of T, based on the result of step 1.

3. Test P’ with T’, establishing the correctness of P’ with respect to T’.

4. If necessary, create a set of new tests T’’ to test the new, modified and untested portions of P’.

5. Test P’ with T’’, establishing the correctness of P’ with respect to T’’.

6. The test suite for P’, thus, is T’ ∪ T’’.
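The six steps can be condensed into a minimal Python sketch (all names and the shape of the inputs are illustrative, not taken from any specific tool; step 1's diff is assumed to be already computed into `modified`):

```python
def selective_retest(test_history, modified, t_new):
    """Return the test suite for P' (steps 2 and 6 of selective retest).

    test_history: maps each test id to the set of entities of P it executed.
    modified:     entities of P changed in P' (result of step 1).
    t_new:        T'', new tests written for the new/untested parts of P'.
    """
    # Step 2: T' = tests whose history touches a modified entity.
    t_sel = {t for t, covered in test_history.items() if covered & modified}
    # Steps 3 and 5 (actually running P') are outside this sketch.
    # Step 6: the suite for P' is T' union T''.
    return t_sel | set(t_new)

history = {"t1": {"m1"}, "t2": {"m2"}, "t3": {"m1", "m2"}}
suite = selective_retest(history, modified={"m1"}, t_new={"t4"})
print(sorted(suite))  # -> ['t1', 't3', 't4']
```

Only t2, which never exercised a modified entity, is dropped from the suite.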


How Regression Testing Is Usually Implemented

» Given P, an instrumented version P1 is created to store information on P's execution

» P is (usually) represented as a flow graph

» Tests T are run over P1

» For each t in T, P1 stores information on the portion of code executed by running t (test history)

» P becomes P’

» The flow graph of P’ is built

» t in T will be in T’ if the portion of modified code affects t; that is, if t, according to the test history of P, executes a portion of code modified in P’, then t needs to be run again on P’



Regression Test Selection Techniques


In other terms…

P code

class x { m1, m2 }

class y { m3 }

class z { m4 }

Test Suite

Test History (obtained by code instrumentation)

The methods “reached” by executing the test cases

T1 = m1 + m3

T2 = m2+m3+m4

P’ code

class x { m1’, m2 }

class y { m3 }

class z { m4 }

Since test T1 visits method m1, and m1 is changed, P’ must be tested again with respect to T1.

T2 does not need to be re-executed.

P is usually represented by a graph.
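The slide's example can be replayed as a small Python sketch (the test-history encoding as sets of method names is illustrative):

```python
# Test history from the slide: the methods "reached" by each test case.
test_history = {
    "T1": {"m1", "m3"},
    "T2": {"m2", "m3", "m4"},
}
changed = {"m1"}  # in P', m1 becomes m1'

# A test must be rerun iff its history intersects the changed methods.
retest = sorted(t for t, reached in test_history.items() if reached & changed)
print(retest)  # -> ['T1']  (T2 does not need to be re-executed)
```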


Regression Test Selection Techniques

» Control-Flow-Based Regression Testing

- Traverses corresponding paths in G and G’, the flow graphs of P and P’.

- If it finds that the sink of an edge has changed, it selects the tests that executed that edge in G.

» TestTube

- Based on code entities

- It maintains the function trace list produced by each test case, and expands it into an entity trace list.

- To detect changed entities, an entity is defined as a sequence of tokens, and an entity is considered changed if one of its tokens has changed.

- Tends to select more tests than control-flow-based techniques
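The control-flow-based rule can be sketched with flow graphs encoded as edge sets (an illustrative encoding, not the representation used by the cited tools):

```python
def select_by_edges(edges_g, edges_g2, edge_history):
    """Select the tests that executed an edge of G that changed in G'."""
    # Edges of G that are absent in G' (removed, or whose sink changed).
    changed = edges_g - edges_g2
    return {t for t, executed in edge_history.items() if executed & changed}

# G: 1->2->3, plus 2->4; in G' the edge 2->4 is redirected to 2->5.
g  = {(1, 2), (2, 3), (2, 4)}
g2 = {(1, 2), (2, 3), (2, 5)}
history = {"tA": {(1, 2), (2, 3)}, "tB": {(1, 2), (2, 4)}}
print(sorted(select_by_edges(g, g2, history)))  # -> ['tB']
```

Only tB, which traversed the redirected edge, is selected for re-execution.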


Regression Test Selection Techniques

» Incremental Program Testing

- It uses program slicing

- If a statement in P’ has no corresponding statement in P, and/or the behavior of the same statement in P and P’ is not equivalent, then this statement is considered affected

» Domain-Based Regression Testing

- A little different from the others

- First the domain is updated by defining the modifications of the command languages; then the test suite is updated and the regression test suite is selected based on the subdomains


Spec-based Regression Testing


Overview (not required for the exam)

» Dependence Relations

A. Firewall: (static) control dependence / inheritance, aggregation, association… [kung95]

B. FDG: function dependence (+ data members) [wu99][wu00]

C. CSIG: control + data flow (spec. + impl.) [BeydedaGruhn]

D. DejaVu: (CFG control flow) [harrold01nov]


Background

» A Safe, Efficient Regression Test Selection Technique (Gregg Rothermel, Mary Jean Harrold)

- Source code → Control Flow Graph

node: statement

edge: flow of control between statements

- Compare by edge


Background

» Testing of object-oriented programs based on finite state machines (H.S. Hong)

- Source code → Class State Machine

transitions: (source, target, event, guard, action)

» Techniques for Testing Component-Based Software (Ye Wu, Dai Pan, Mei-Hwa Chen)

- Component Interaction Graph (CIG): interactions and dependence relationships among components

> Interface nodes: basic access points, via which components are activated

> Event nodes: the event from the calling interface to the called interface

> Context dependence edges: similar to control-flow dependence relationships

> Content dependence edges: data dependence relationships (regression testing for OOP)


A. Developing an Object-Oriented Software Testing and Maintenance Environment (David Kung, Jerry Gao)

• ORD (Object Relation Diagram)

– inheritance, aggregation, association…

– Test order

• BBD (Block Branch Diagram)

– Function invocation

– Test path

• OSD (Object State Diagram)

– State behavior for object class

[kung95]


B. Regression Testing on Object-Oriented Programs (Ye Wu, Mei-Hwa Chen)

• AFDG:

– Affected Function Dependence Graph

– Affected variable / functions, function dependence relations

• FCG:

– Function Calling Graph

– Execution history of each test case

[Figure: the AFDG and FCG are combined with the CFG of P’ to select the regression tests RTs]

[wu99]


C. Integrating White- and Black-Box Techniques for Class-Level Regression Testing (Sami Beydeda)

• CSM: Class State Machine

• CSIG: Class Specification Implementation Graph

– Prototype: each event within transitions of CSM

– CCFG: Class Control Flow Graph (for each prototype/method)

• Method implementation graph

• Method specification graph

– Data-flow: black-box testing cases

• Regression testing

– White-box: compare by node

– Black-box: by edge

[Figure: from the CSM, prototypes are derived; each yields a CCFG, combined with data-flow (def-use) information into the CSIG; comparing CSIG and CSIG’ yields the regression test cases]

[BeydedaGruhn]


D. Using Component Metadata to Support the Regression Testing of Component-based Software (Mary Jean Harrold)

» Code-based

- Metadata → branch coverage for T.

- DejaVu (CFG)

» Spec-based

- Test frame: test specification for functional unit

- Metadata → test frame coverage for T.

- DejaVu (a test frame is selected if at least one of its paths is associated with a changed statement)

[harrold01nov]


SA-based Regression Testing


Why?

» Motivations and Goals:

- To reduce the testing effort during architecture or code maintenance

- To handle the “architectural drift”

- To maximize the benefits of using Software Architecture


Why?

» Because changes in P may affect only a small set of architectural interactions, and we do not want to retest everything

» “… the use of software architecture for regression testing activities has the potential for a bigger impact on the cost of software than those techniques that focus only on development testing.” [Harrold_Rosatea98]

» “We need to develop techniques that can be applied to … the … architecture, to assist in selective retest of the software. These techniques will let us identify existing test cases that can be used to retest the software. These techniques will also let us identify those parts of the modified software for which new test cases are required.” [Harrold_FOSE2000]


SA-based Regression Testing (1/2)

» Assumptions:

- we have the Software Architecture,

- some architectural tests SAT,

- the P code and

- a mapping function map(SAT, P) → CTp that produces the code-level tests

- we run the CTp tests over P and evaluate the conformance of P to the SA

» We may change SA in SA’ or P in P’
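The assumed artifacts can be phrased as a tiny Python sketch (every name here is hypothetical; the real map(SAT, P) mapping is far more involved than this one-liner):

```python
# Hypothetical architectural test cases SAT: named sequences of
# architectural interactions.
SAT = {"at1": ["addCall", "callAttended"]}

def map_sat_to_code(sa_tests):
    """map(SAT, P) -> CTp: derive code-level tests from architectural ones.

    Here, trivially, one code-level test identifier per architectural
    scenario; in practice the mapping must bridge the abstraction gap
    between SA elements and code entities.
    """
    return {name: "run_" + "_then_".join(steps)
            for name, steps in sa_tests.items()}

CTp = map_sat_to_code(SAT)
print(CTp)  # -> {'at1': 'run_addCall_then_callAttended'}
```

The CTp tests are then executed on P to evaluate its conformance to the SA.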


SA-based Regression Testing [SARTE Project]

[Figure: SARTE process overview; test reuse occurs at each stage of evolution]


Goal 1

» Goal 1: Test Conformance of a Modified Implementation P' to the initial SA:

- Context:

> Given a software architecture specification for S, and an implementation P, we first gain confidence that P correctly implements S.

> P' modifies P: some components are modified, and/or some new components are introduced.

- Goal: Test the conformance of P' with respect to S, while reusing previous test information for selective regression testing, thereby reducing the test cases that must be retested.


Goal 2

» Test Conformance of an Evolved Software Architecture

- Context:

> P correctly implements the SA S

> S is modified into S’, adding or removing components

> A modified implementation P' may have been also developed.

- Goal: Test the conformance of P (or P‘) with respect to S', while reusing previous test information for selective RT, thereby reducing the test cases that must be retested.


Activity Diagram of our SA-based Regression Testing Approach


Goal 1


Considerations

» Differences with respect to traditional code-based RT techniques:

- the oracle in SA-based RT is the software architecture specification itself.

> In fact, when t1 is run on P', the test fails if its execution does not allow the expected behavior to be reproduced

- Moreover, code-level test cases are always driven by well-formalized functional and structural architectural requirements.


Advantages

» i) as in traditional RT, we reduce the size of the test suite for P', eliminating all those tests which do not need to be reapplied to P', and

» ii) when conformance faults are detected, we can gather information on how to adjust the initial architecture.


Goal 2: Idea

» Compare the two architectural specifications to identify changed/unchanged portions of the SA.

» Both structural and behavioral changes are taken into account.

- In particular, the LTSs for S and S’ are compared, and differences between the two graphs are identified (using a sort of “Diff” algorithm).

» In a fashion similar to traditional code-based RT, whenever an ATC (architectural test case) traverses a path in the LTS of S that has been modified in the LTS of S’, it needs to be retested on S’.
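This selection rule can be sketched in Python, with LTSs encoded as transition sets and ATCs as state paths (a hypothetical encoding, for illustration only):

```python
def atcs_to_retest(atcs, lts_s, lts_s2):
    """Retest every ATC whose path traverses a transition changed
    between the LTS of S and the LTS of S'."""
    # Transitions added or removed by the architectural evolution.
    changed = lts_s ^ lts_s2
    selected = []
    for name, path in atcs.items():
        transitions = set(zip(path, path[1:]))
        if transitions & changed:
            selected.append(name)
    return selected

lts_s  = {("s0", "s1"), ("s1", "s2"), ("s1", "s3")}
lts_s2 = {("s0", "s1"), ("s1", "s2"), ("s1", "s4")}  # s1->s3 replaced by s1->s4
atcs = {"ATC1": ["s0", "s1", "s2"], "ATC2": ["s0", "s1", "s3"]}
print(atcs_to_retest(atcs, lts_s, lts_s2))  # -> ['ATC2']
```

ATC1 only traverses unchanged transitions and is skipped; ATC2 crosses the modified region and must be retested.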


Goal 2


Experiment 1: Elevator System

» The approach has been applied to the Elevator case study

- Architecture in the C2 style

- Implemented using the C2 framework

» Tool support:

- The SA is formally specified using the C2 specification language and FSP.

- A behavioral model of the system is automatically produced from the FSP specification, using the LTSA tool.


Elevator SA

[Figure: Elevator SA. Version 1: BuildingPanel, ElevatorPanel1, ElevatorADT1. Version 2: BuildingPanel, ElevatorPanel1, ElevatorPanel2, ElevatorADT1, ElevatorADT2, Scheduler]


Experiment 1

- The abstraction over the LTS behavioral model is realized again in FSP.

- The mapping between architectural tests and code-level tests is provided by the C2 Framework.

- Test cases are run over the code using the Argus-I environment.

- The selective regression testing step is supported by the DejaVOO tool.


Elevator System, Version 1 (components: BuildingPanel, ElevatorPanel1, ElevatorADT1)

Architectural “Paths” (Scenarios):

Path #1: BP sends Call; ADT1 attends Call (sequence: addCall(x,dir), callAttended(x))

Path #2: EP1 sends Call; ADT1 attends Call (sequence: addCall(x), callAttended(x))

Test Cases

Architectural Paths = 2

Path #1 – Test Cases:
TC1: BP.addCall(1,up); ADT1(1,up)
TC2: BP.addCall(1,up); ADT1(2,up)
TC3: BP.addCall(1,up); ADT1(2,down)
TC4: BP.addCall(2,up); ADT1(1,up)
TC5: BP.addCall(2,up); ADT1(2,down)
TC6: BP.addCall(2,up); ADT1(3,down)
TC7: BP.addCall(2,down); ADT1(1,up)
TC8: BP.addCall(2,down); ADT1(2,up)
TC9: BP.addCall(2,down); ADT1(3,up)
TC10: BP.addCall(3,down); ADT1(4,down)
TC11: BP.addCall(3,down); ADT1(3,down)
TC12: BP.addCall(3,down); ADT1(2,down)

Path #2 – Test Cases:
TC1: EP1.addCall(1); ADT1(1,up)
TC2: EP1.addCall(1); ADT1(2,up)
TC3: EP1.addCall(1); ADT1(2,down)
TC4: EP1.addCall(2); ADT1(1,up)
TC5: EP1.addCall(2); ADT1(2,down)
TC6: EP1.addCall(3); ADT1(2,down)


Elevator System, Version 2 (components: BuildingPanel, ElevatorPanel1, ElevatorPanel2, ElevatorADT1, ElevatorADT2, Scheduler)

Architectural “Paths” (Scenarios): Architectural Paths = 4

Path #1: BP sends Call; ADT1 attends Call (sequence: BP.addCall(x,dir); Scheduler sends getDistanceToCall(x,dir) to ADT1 and ADT2; each replies distanceToCall(x,1); Scheduler sends addCall(x,dir) to the chosen ADT; callAttended(x))


Elevator System v2 test cases

Path #2: BP sends Call; ADT2 attends Call

Path #3: EP1 sends Call; ADT1 attends Call (sequence: EP1.addCall(x), callAttended(x))

Path #4: EP2 sends Call; ADT2 attends Call (analogous to Path #3)


Architecture-Based Regression Testing

Elevator System - Version 1: Architectural Paths = 2
Elevator System - Version 2: Architectural Paths = 4 (or 3?)

Version 1 Path #1 (BP sends Call; ADT1 attends Call) is modified in Version 2: the call is now routed through the Scheduler, with getDistanceToCall(x,dir)/distanceToCall(x,1) exchanges between Scheduler, ADT1 and ADT2. Its test cases need to be retested:

TC1: BP.addCall(1,up); ADT1(1,up)
TC2: BP.addCall(1,up); ADT1(2,up)
TC3: BP.addCall(1,up); ADT1(2,down)
TC4: BP.addCall(2,up); ADT1(1,up)
TC5: BP.addCall(2,up); ADT1(2,down)
TC6: BP.addCall(2,up); ADT1(3,down)
TC7: BP.addCall(2,down); ADT1(1,up)
TC8: BP.addCall(2,down); ADT1(2,up)
TC9: BP.addCall(2,down); ADT1(3,up)
TC10: BP.addCall(3,down); ADT1(4,down)
TC11: BP.addCall(3,down); ADT1(3,down)
TC12: BP.addCall(3,down); ADT1(2,down)

Version 1 Path #2 (EP1 sends Call; ADT1 attends Call) is unchanged in Version 2 (where it appears as Path #3). Its test cases do not need to be retested:

TC1: EP1.addCall(1); ADT1(1,up)
TC2: EP1.addCall(1); ADT1(2,up)
TC3: EP1.addCall(1); ADT1(2,down)
TC4: EP1.addCall(2); ADT1(1,up)
TC5: EP1.addCall(2); ADT1(2,down)
TC6: EP1.addCall(3); ADT1(2,down)


Experiment 2: the Cargo Router example


Architecture(s) of the Cargo Router

[Figure: Cargo Router architectures. a) Version 1: Clock (C), Planner (P), PlannerArtist (PlA), CargoRouter (CR), NextShipment (NS), Port (P), Vehicle (V), Warehouse (W), with PortArtist (PA), VehicleArtist (VA), WarehouseArtist (WA) and a Graphics Binding (GB), connected through buses Bus1–Bus4. b) Version 2 adds CargoRouter2 (CR2), PortArtist2 (PA2), VehicleArtist2 (VA2), WarehouseArtist2 (WA2) and a Translator (T). Note: in Version 2, the connection (*) is replaced with the connections (**).]


Architectural Test Case


Related Topics

» Regression Testing

- Many papers on code-based regression testing

- Very few (only two) papers on specification-based regression testing

» SA-level Dependence Analysis

» Code-level Dependence Analysis

» SA-based Testing


References

» Mary Jean Harrold, Testing Evolving Software. Journal of Systems and Software, vol. 47, no. 2-3, pp. 173-181, July 1999.

» Mary Jean Harrold, Testing: A Roadmap. In Future of Software Engineering, 22nd International Conference on Software Engineering, June 2000.

» Anneliese von Mayrhauser, Richard T. Mraz, Jeff Walls, Domain Based Regression Testing. Dept. of Computer Science, Colorado State University.

» Gregg Rothermel and Mary Jean Harrold, Proc. of the Conf. on Software Maintenance, Montreal, Canada, September 1993, pp. 358-367.

» Yih-Farn Chen, David S. Rosenblum and Kiem-Phong Vo, TestTube: A System for Selective Regression Testing. Software Engineering Research Department, AT&T Bell Laboratories, NJ.

» Samuel Bates and Susan Horwitz, Incremental Program Testing Using Program Dependence Graphs. University of Wisconsin-Madison.


References (Cont’d)

» Todd L. Graves, Mary Jean Harrold, Jung-Min Kim, Adam Porter and Gregg Rothermel, An Empirical Study of Regression Test Selection Techniques. ACM Transactions on Software Engineering and Methodology (to appear).

» David Binkley, The Application of Program Slicing to Regression Testing. Loyola College, Maryland.

» Mary Jean Harrold, Architecture-Based Regression Testing of Evolving Systems. The Ohio State University.