
BEST PRACTICES FOR THE FORMAL SOFTWARE TESTING PROCESS


Also Available from Dorset House Publishing

The Deadline: A Novel About Project Management
by Tom DeMarco
ISBN: 0-932633-39-0  Copyright ©1997  320 pages, softcover

Dr. Peeling's Principles of Management: Practical Advice for the Front-Line Manager
by Nick Peeling
ISBN: 0-932633-54-4  Copyright ©2003  288 pages, softcover

Five Core Metrics: The Intelligence Behind Successful Software Management
by Lawrence H. Putnam and Ware Myers
ISBN: 0-932633-55-2  Copyright ©2003  328 pages, softcover

Measuring and Managing Performance in Organizations
by Robert D. Austin  foreword by Tom DeMarco and Timothy Lister
ISBN: 0-932633-36-6  Copyright ©1996  240 pages, softcover

Peopleware: Productive Projects and Teams, 2nd ed.
by Tom DeMarco and Timothy Lister
ISBN: 0-932633-43-9  Copyright ©1999  264 pages, softcover

Project Retrospectives: A Handbook for Team Reviews
by Norman L. Kerth  foreword by Gerald M. Weinberg
ISBN: 0-932633-44-7  Copyright ©2001  288 pages, softcover

Surviving the Top Ten Challenges of Software Testing: A People-Oriented Approach
by William E. Perry and Randall W. Rice
ISBN: 0-932633-38-2  Copyright ©1997  216 pages, softcover

Waltzing with Bears: Managing Risk on Software Projects
by Tom DeMarco and Timothy Lister
ISBN: 0-932633-60-9  Copyright ©2003  208 pages, softcover

For More Information

• Contact us for prices, shipping options, availability, and more.

• Sign up for DHQ: The Dorset House Quarterly in print or PDF.

• Send e-mail to subscribe to e-DHQ, our e-mail newsletter.

• Visit Dorsethouse.com for excerpts, reviews, downloads, and more.

DORSET HOUSE PUBLISHING
An Independent Publisher of Books on Systems and Software Development and Management. Since 1984.
353 West 12th Street, New York, NY 10014 USA
1-800-DH-BOOKS (1-800-342-6657)  212-620-4053  fax: 212-727-1044
[email protected]  www.dorsethouse.com


BEST PRACTICES FOR THE FORMAL SOFTWARE TESTING PROCESS

A MENU OF TESTING TASKS

Rodger D. Drabick


Dorset House Publishing
353 West 12th Street
New York, NY 10014


Library of Congress Cataloging-in-Publication Data

Drabick, Rodger.
Best practices for the formal software testing process : a menu of testing tasks / Rodger Drabick.
p. cm.
Includes bibliographical references and index.
ISBN 0-932633-58-7 (softcover)
1. Computer software—Testing. I. Title.

QA76.76.T48D73 2003
005.1'4—dc22
2003062472

All product and service names appearing herein are trademarks or registered trademarks or service marks or registered service marks of their respective owners and should be treated as such.

Cover Design: Nuno Andrade
Cover Graphic: Detail from Figure 1-5
Cover Author Photograph: Courtesy of Lockheed Martin

Copyright © 2004 by Rodger D. Drabick. Published by Dorset House Publishing Co., Inc., 353 West 12th Street, New York, NY 10014.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without prior written permission of the publisher.

Distributed in the English language in Singapore, the Philippines, and Southeast Asia by Alkem Company (S) Pte. Ltd., Singapore; in the English language in India, Bangladesh, Sri Lanka, Nepal, and Mauritius by Prism Books Pvt., Ltd., Bangalore, India; and in the English language in Japan by Toppan Co., Ltd., Tokyo, Japan.

Printed in the United States of America

Library of Congress Catalog Number: 2003062472

ISBN: 0-932633-58-7 12 11 10 9 8 7 6 5 4 3 2 1

Digital release by Pearson Education, Inc., June, 2013


Dedication

To my wife, Karen, without whose love and support this book and the other things I do would not be possible; and to my daughters, Alyson and Liz—I love you all.


Acknowledgments

Many people—including coworkers, managers, SQA and testing associates, family, and friends—have earned a special Thank You for helping me bring my ideas from a concept to a published book.

I'd like first to express my thanks—in memoriam—to Peter Natale, formerly of Eastman Kodak Company, who provided me with my first opportunity in software testing. Without Pete's support and mentorship, this book would not have been possible. Thanks also to Frank Mondziel, who provided guidance in our first efforts in software QA and testing at Kodak, and to Ed Cattron, a long-time friend, coworker, and manager at Kodak, who supported my initial efforts to develop and formalize this process model on a large and complex Internal Revenue Service program in the early 1990's.

I also owe a great debt of gratitude to Bill Perry of the Quality Assurance Institute and Dr. Edward Miller of Software Research, Inc., both of whom gave me the opportunity to present portions of this model at their testing conferences. I made my first presentation on the topic as a keynote address at a Software Quality Week, hosted by Ed and his company. Independent consultant Denis Meredith gave me the opportunity to present the model end-to-end at one of the Testing Computer Software Conferences he coordinated. Denis remains my "go to guy" for questions about requirements and risk management.


Thank you very much, gentlemen! It's been a pleasure knowing you, and learning from you.

To so many at Software Quality Engineering who have been helpful to me along the way, I give special thanks: Bill Hetzel, former president of SQE, and Dave Gelperin and Rick Craig, associates at SQE—all of you have provided invaluable technical and moral support throughout my testing career.

Lisa Crispin, coauthor of Testing Extreme Programming, introduced me to special techniques for approaching testing of that agile method. For that knowledge and for reviewing a number of draft chapters, Lisa, I am deeply grateful.

It's difficult to adequately express the gratitude I owe to several editors. First, my friend of many years, Debbie Lafferty, formerly Acquisitions Editor at Addison-Wesley, started me working on this book. Debbie introduced me on-line to Wendy Eakin at Dorset House Publishing, and that introduction has resulted in the final product you hold in your hands. To Wendy and the rest of the Dorset House staff, including Nuno Andrade, Vincent Au, and David McClintock, many thanks for helping me turn a concept into a real book that I hope will provide valuable information to the testing community! Thanks also to Dorset House author and software systems consultant Tim Lister, for his recommendation that this book be published.

And finally, a lifetime of gratitude and thanks to Karen, my wife and friend of many years. Without your help and encouragement, this book would never have been started, much less completed.


Contents

Foreword xiii

Preface xv

Chapter 1 Introduction 3

1.1: OVERVIEW 3
1.2: THE SOFTWARE DEVELOPMENT LIFE CYCLE 4
1.3: THE FORMAL TESTING LIFE CYCLE 6
1.4: ORGANIZATIONAL CONSIDERATIONS 11
1.5: THE NEED FOR THE PROCESS MODEL 16
1.6: HOW TO USE THE FORMAL TESTING PROCESS MODEL 17
1.7: INPUT-PROCESS-OUTPUT MODELS 21
1.8: LEVEL 0 IPO MODEL FOR FORMAL TESTING 24
1.9: LEVEL 1 IPO MODEL FOR FORMAL TESTING 25
1.10: LIMITATIONS INHERENT IN THIS BOOK'S SCOPE 26
1.11: WHAT'S NEXT? 27

Chapter 2 The Formal Testing Process Model: Level 1 IPO Diagrams 28

2.1: OVERVIEW 28
2.2: EXTRACT TEST INFORMATION FROM PROGRAM PLANS—LEVEL 1 IPO DIAGRAM 28
2.3: CREATE TEST PLAN 34
2.4: CREATE TEST DESIGN, TEST CASES, TEST SOFTWARE, AND TEST PROCEDURES 39
2.5: PERFORM FORMAL TEST 43
2.6: UPDATE TEST DOCUMENTATION 47
2.7: WHAT'S NEXT? 50

Chapter 3 Extract Test Information from Program Plans: Levels 2 and 3 IPO Diagrams 51

3.1: OVERVIEW 51
3.2: EXTRACT TEST INFORMATION FROM PROGRAM PLANS—LEVEL 2 IPO DIAGRAM 52
3.3: REVIEW PROGRAM MANAGEMENT PLAN—LEVEL 3 IPO DIAGRAM 56
3.4: REVIEW QUALITY ASSURANCE PLAN—LEVEL 3 IPO DIAGRAM 60
3.5: REVIEW SOFTWARE DEVELOPMENT PLAN—LEVEL 3 IPO DIAGRAM 64
3.6: REVIEW CONFIGURATION MANAGEMENT PLAN—LEVEL 3 IPO DIAGRAM 69
3.7: WHAT'S NEXT? 72

Chapter 4 Create Test Plan: Levels 2 and 3 IPO Diagrams 73

4.1: OVERVIEW 73
4.2: CREATE TEST PLAN—LEVEL 2 IPO DIAGRAM 74
4.3: ANALYZE REQUIREMENTS—LEVEL 3 IPO DIAGRAM 78
4.4: WRITE TEST PLAN—LEVEL 3 IPO DIAGRAM 83
4.5: REVIEW TEST PLAN—LEVEL 3 IPO DIAGRAM 89
4.6: WHAT'S NEXT? 93

Chapter 5 Create Test Design, Test Cases, Test Software, and Test Procedures: Levels 2 and 3 IPO Diagrams 94

5.1: OVERVIEW 94
5.2: CREATE TEST DESIGN, TEST CASES, TEST SOFTWARE, AND TEST PROCEDURES—LEVEL 2 IPO DIAGRAM 95
5.3: REVIEW TEST PLAN—LEVEL 3 IPO DIAGRAM 102
5.4: CREATE TEST DESIGN—LEVEL 3 IPO DIAGRAM 105
5.5: REVIEW TEST DESIGN—LEVEL 3 IPO DIAGRAM 109
5.6: CREATE TEST CASES—LEVEL 3 IPO DIAGRAM 113
5.7: ACQUIRE TEST SOFTWARE-BUILD—LEVEL 3 IPO DIAGRAM 119
5.8: ACQUIRE TEST SOFTWARE-BUY—LEVEL 3 IPO DIAGRAM 123
5.9: CREATE TEST PROCEDURES—LEVEL 3 IPO DIAGRAM 126
5.10: WHAT'S NEXT? 131

Chapter 6 Perform Formal Test: Levels 2 and 3 IPO Diagrams 132

6.1: OVERVIEW 132
6.2: PERFORM FORMAL TEST—LEVEL 2 IPO DIAGRAM 133
6.3: HOLD PRETEST MEETING—LEVEL 3 IPO DIAGRAM 137
6.4: EXECUTE TEST—LEVEL 3 IPO DIAGRAM 141
6.5: DETERMINE DISPOSITION OF INCIDENTS—LEVEL 3 IPO DIAGRAM 146
6.6: HOLD POSTTEST MEETING—LEVEL 3 IPO DIAGRAM 151
6.7: WRITE TEST REPORT—LEVEL 3 IPO DIAGRAM 155
6.8: WHAT'S NEXT? 159

Chapter 7 Update Test Documentation: Levels 2 and 3 IPO Diagrams 160

7.1: OVERVIEW 160
7.2: UPDATE TEST DOCUMENTATION—LEVEL 2 IPO DIAGRAM 161
7.3: ANALYZE TEST DOCUMENTATION PROBLEMS—LEVEL 3 IPO DIAGRAM 164
7.4: UPDATE TEST DOCUMENTS—LEVEL 3 IPO DIAGRAM 167
7.5: REVIEW AND APPROVE SOFTWARE TEST DOCUMENTS—LEVEL 3 IPO DIAGRAM 170
7.6: WHAT'S NEXT? 173

Chapter 8 Tailoring the Model 174

8.1: OVERVIEW 174
8.2: USING THE TESTING PROCESS MODEL FOR LARGE, SAFETY-CRITICAL, OR PROFIT-CRITICAL PROGRAMS 175
8.3: USING THE TESTING PROCESS MODEL FOR MEDIUM-SIZE PROGRAMS 178
8.4: USING THE TESTING PROCESS MODEL ON SMALL PROJECTS 179
8.5: USING THE PROCESS MODEL WITH EXTREME PROGRAMMING 181
8.6: STARTING FROM SCRATCH 183

Chapter 9 Summing Up 188

9.1: REPRISE: HOW TO USE THE TESTING PROCESS MODEL 188
9.2: SUMMARY THOUGHTS 193
9.3: CONCLUSION 196

Appendices 197

Appendix A: The Software Engineering Institute, the Capability Maturity Model-Software, and the Capability Maturity Model Integration 199

A.1: THE CAPABILITY MATURITY MODEL FROM THE SOFTWARE ENGINEERING INSTITUTE AT CARNEGIE MELLON UNIVERSITY 199
A.2: CAPABILITY MATURITY MODEL INTEGRATION 202

Appendix B: Preferred Practices 204

B.1: PROGRAM MANAGEMENT PLAN AND TEMPLATE 204
B.2: SOFTWARE DEVELOPMENT PLAN AND TEMPLATE 213
B.3: SOFTWARE QUALITY ASSURANCE PLAN AND TEMPLATE 221
B.4: CONFIGURATION MANAGEMENT PLAN AND TEMPLATE 229

Appendix C: Questionnaire for Evaluating Local Testing Processes 249

C.1: EXTRACT TEST INFORMATION FROM PROGRAM PLANS 249
C.2: CREATE TEST PLAN 252
C.3: CREATE TEST DESIGN, TEST CASES, TEST SOFTWARE, AND TEST PROCEDURES 254
C.4: PERFORM FORMAL TEST 257
C.5: UPDATE TEST DOCUMENTATION 259

Appendix D: A Primer for Test Execution 260

Glossary 267

Bibliography 275

Index 281


Foreword

Rodger and I first met in 1985 at the Testing Computer Software Conference, which I coordinated. This was Rodger's first software testing conference, and like many folks new to software testing, he was determined to learn as much as he could about the fascinating field of software testing in the limited time available. At the time, Rodger had been working as a team lead for system testing on a large software program for a government customer.

Over the years, I watched Rodger's career develop, and witnessed as his focus changed from learning and using testing techniques, to testing management, and finally to concentration on the software testing process. In the mid-1990's, Rodger presented aspects of his process model at the Quality Assurance Institute's International Conference on Software Testing, the same process model that this book describes in detail.

Best Practices for the Formal Software Testing Process: A Menu of Testing Tasks is the product of the knowledge that Rodger has gained through personal experience as well as from seminars and conferences put on by the QAI and other well-respected testing organizations.

Primarily intended to help new test engineers and test engineering managers understand the steps involved in testing software systems and hardware-software systems, this book will be useful to any testing organization interested in documenting and improving its testing process. It begins by showing you how to develop a test plan based on the software and system requirements, which identifies exactly what will be tested, and goes on to detail the tasks and processes involved in executing the test and documenting the results of the testing.

While this book focuses primarily on testing associated with large development programs—using either a spiral life cycle or the Unified Software Development Process, with its iterative phases—the information Rodger offers can be put to the most effective use through tailoring the process to fit your specific environment.

Readers familiar with the testing documentation and testing life cycle contained in IEEE-Standard-829, the Standard for Software Test Documentation, will readily see the links to the process model detailed in this book. Nevertheless, those working with other life cycles will find guidance in tailoring this testing process to fit those life cycles.

Best Practices for the Formal Software Testing Process serves as a fine example of What to Do to Test. As Rodger writes, this is the book he wished he'd had when he started his software testing career. This book should be in the hands of every member of every software testing organization. If you buy this book, plan to use it; don't just set it on your bookshelf.

September 2003
William E. Perry
Orlando, Florida


Preface

What is "the formal software testing process"?

One of the definitions the Institute of Electrical and Electronics Engineers (IEEE) Software Standards collection provides for process is "a course of action to be taken to perform a given task" or "a written description of a course of actions, for example, a documented test procedure." Various editions of Webster's Dictionary offer a series of definitions for process, including "a particular method of doing something, generally involving a number of steps or operations."

A simpler definition of a process is "the way we do things." Thus, we could say that the "testing process" is "a course of action to be taken to perform formal testing," or "the way we do testing here."

But how does "the way we do testing here" stack up against industry standards for best testing practices, and why did I go to all the effort to define a Formal Testing Process Model? Let me share a short personal retrospective.

I have spent twenty-eight years of a mostly exciting technical career testing and managing testing programs. These programs started out as hardware programs, and evolved into hardware and software systems programs in the mid-1970's. I was lucky in some respects to work for a large company in the Department of Defense (DoD) area; thus, we were always in an environment that forced us to be rigorous and methodical, both in test execution and in test documentation. Note that being methodical didn't always lead to delivering successful systems.

When I first began working in programs that were primarily software, I was looking for sources of material on "best practices" relative to the software testing process. Being in the DoD environment, our customers supplied Contract Deliverable Requirements Lists (CDRLs) that identified the documents we had to write. Thus, we had a framework for our programs. Later, I discovered a number of military standards (for example, DoD-Standards 2167A and 2168), and the IEEE Software Engineering Standards, which I continue to use to this day. Following the series of documentation contained in IEEE-Standard-829, the Standard for Software Test Documentation, suggested a standard process to utilize for software and systems test documentation.

In 1992, while working in the Commercial and Government Systems Division of the Eastman Kodak Company, my job was to develop a testing estimate for a program we were bidding for the Internal Revenue Service. Having built estimates previously for large, complex, testing programs, I knew that I needed a logical, structured list of tasks to form the basis of my estimate. I could then convert this task list into a work breakdown structure (WBS), and estimate the time to perform each task.

About this same time, I was reading Watts Humphrey's fascinating book Managing the Software Process (Humphrey, 1989). Watts showed how to use Input-Process-Output (IPO) diagrams to document a process; I applied that technique to the testing process. That was the first formal documentation of the top levels of the model you will see described in this book. I published an article describing how aspects of this model could be used in estimating, in 1993 in the June issue of Ed Yourdon's American Programmer newsletter (Yourdon, 1993).[1] That's one use of such a model. Over the years, I continued to refine the model until it became the fairly detailed model presented in this book.

[1] Rodger D. Drabick, "A Process Model Tool for Estimating Software Costs," American Programmer, Vol. 6, No. 6 (1993), pp. 20-27.

Incidentally, our prime contractor accepted my estimate, and both it and the Internal Revenue Service accepted the overall estimate.
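To give a feel for Humphrey's technique, here is one way the essence of an IPO diagram might be captured in code. This is a minimal sketch of the Input-Process-Output idea only, assuming nothing beyond named inputs, a process, and named outputs; it is not Humphrey's notation or this book's model, and the example entries are paraphrased from this preface.

    # A minimal sketch of an Input-Process-Output (IPO) diagram as data:
    # a named process plus the inputs it consumes and the outputs it
    # produces. Entries are paraphrased from the preface, not the model.
    from dataclasses import dataclass, field

    @dataclass
    class IPODiagram:
        process: str
        inputs: list = field(default_factory=list)
        outputs: list = field(default_factory=list)

        def describe(self) -> str:
            return (f"{', '.join(self.inputs)} -> "
                    f"[{self.process}] -> {', '.join(self.outputs)}")

    # Test planning consumes the developing requirements and produces
    # a test plan.
    test_planning = IPODiagram(
        process="Create Test Plan",
        inputs=["requirements under development"],
        outputs=["test plan"],
    )
    print(test_planning.describe())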

The first light bulb that lit as I began to learn about software testing is that testing has a life cycle that is closely linked to the software development life cycle. What seems like a logical truth in 2003 was a real revelation to many in the mid-1970's. I still think that the "V-diagram" is a thing of beauty. The V-diagram is a means of illustrating the relationship between the concurrent software development and testing life cycles (see Figure 1-2 for one example of the V-diagram). Unfortunately, there are still a large number of people and corporations developing software who haven't recognized this basic truth. We write a test plan while requirements are being developed. We then do test design and write test cases as the software design is being developed, and write test procedures during the coding phase.[2] Execution of formal testing occurs following unit testing and integration testing.

[2] See the Glossary at the end of this book for a definition of "test case," as there is much confusion in the industry regarding this term.

The second light bulb that lit pertained to the critical importance of test planning, and the consequent need for "testable requirements." Many good books have been written on these subjects (Gause and Weinberg, 1989; Wiegers, 1999), so I will not belabor the point here.

The third light bulb that was turned on was realizing that to properly estimate testing programs, we need a structured, internally consistent list of tasks that test engineers will perform on a specific program. Once such a list of tasks is in hand, we can then estimate each task individually to get a bottoms-up estimate for a testing program. And if we perform the same set of tasks (or a very similar set) on each program, we can begin to establish accurate metrics for how much our testing costs, and how long it can be expected to take.
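As a worked example of that arithmetic, the sketch below sums per-task hours for a WBS-style task list. The five task names echo the model's top-level processes, but the hours, cost rate, and staffing level are invented placeholders, not figures from any actual program.

    # A toy bottoms-up estimate: sum per-task hours over a WBS-style
    # task list. All numbers below are invented for illustration.
    wbs_estimates_hours = {
        "Extract test information from program plans": 40,
        "Create test plan": 120,
        "Create test design, cases, software, and procedures": 300,
        "Perform formal test": 200,
        "Update test documentation": 60,
    }

    total_hours = sum(wbs_estimates_hours.values())
    print(f"Bottoms-up testing estimate: {total_hours} hours")

    # Given an assumed cost rate and staffing level, the same task list
    # also yields rough cost and calendar figures.
    hourly_rate_usd = 100
    staff = 2
    print(f"Estimated cost: ${total_hours * hourly_rate_usd:,}")
    print(f"Estimated duration: {total_hours / (staff * 40):.1f} weeks")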

In brief, what I am providing in this book is a soup-to-nuts list of tasks and processes for a program of formal software or system testing. If you are just starting to implement a testing process, you will not want to try to implement all parts of this model in one fell swoop; it would be too overwhelming and too costly. In later chapters, I suggest ways to implement this process in a prioritized, piecewise fashion.

Objectives

There is no "one true way" to test software, so the goal of this book is to provide you with information that you can use to develop a formal testing process that fits the needs of you, your customers, and your organization.

Glenford Myers woke up the testing community in 1979 with his book The Art of Software Testing; he stated that the purpose of testing is to find errors, rather than to prove that a system works. Regardless of your opinion of his thesis, we test to see if we can find any problems with developed systems. A best-practices test process is performed throughout the duration of the development process. The test process puts appropriate emphasis on requirements testability, documentation prior to test execution, review of the documentation, systematic interface with the development community, and test execution according to the approved documentation. A series of tasks to accomplish this testing process is presented in this book.


In addition to defining the sequence of tasks in the testing process, I address interfaces between the formal testing group and the other organizations involved in the software development life cycle, including development engineers, managers, software quality engineers, and software configuration management personnel. All of these people have significant roles that must be performed in a quality manner to make the software development process and the software testing process work. One of the primary purposes of test documentation is communication with the other groups on the program, so we as test engineers must be sure to make these intergroup interfaces work. All of us should be aware that there's a Level 3 key process area (KPA) in the Software Engineering Institute's Capability Maturity Model (SEI-CMM) for Intergroup Coordination. Early in the development life cycle, test engineers and managers review program-level plans, including the Program Management Plan (PMP), the Configuration Management Plan (CMP), the Software Quality Assurance Plan (SQAP), and the Software Development Plan (SDP), or whatever you call these documents in your environment. During the requirements phase of the software development life cycle, test engineers review the requirements. Later in the life cycle, the groups that authored these plans and the requirements get to return the favor by reviewing the test plan. Test engineers review software design (and sometimes code) for input to their test designs, test cases, and test procedures. Once again, development engineers, QA staff members, and CM personnel should review the test designs, test cases, and test procedures. As has been shown in other reference materials (Wiegers, 2002), peer reviews minimize defects and cost-of-quality by finding defects as early as possible in the life cycle.

Intended Audience

I have written this book for four specific audiences, who have much in common:

• new test engineers, who want to learn more about the framework of tasks performed as part of a good testing process

• newly assigned test managers and team leaders, who are not experienced testers (but may have come from a development, quality assurance, or systems engineering background) and who need to learn more about testing quickly

• more experienced test engineers, who are looking for ways to improve their testing process

• process improvement leaders, such as members of software engineering process groups and quality assurance staff

All of these folks realize that testing is necessary to verify the quality of the delivered product(s). They should also be aware that a coordinated program of peer reviews and testing can support a good software development process, but if the developers fall short of building a quality product, it's not possible to "test quality in."

Organizations that are pursuing Level 3 of the Capability Maturity Model for Software (for additional detail on the Capability Maturity Model from the Software Engineering Institute at Carnegie Mellon University, see Appendix A and Website www.sei.cmu.edu) will also find this book valuable, since testing is a significant component of the Software Product Engineering key process area for CMM Level 3. The newer Capability Maturity Model Integration (CMMI), with components of Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, has process areas for Verification and for Validation, both of which involve aspects of testing. This book will assist these organizations in documenting their testing process, which will support them in the Level 3 key process area Organization Process Definition.

There's actually a fifth audience as well: those software development team leads, supervisors, and managers who are interested in learning more about what test engineers and the testing group are doing, and why.

Prerequisites

What should you know to read this book? You should be aware that testing is not a "phase," where software developers "throw the software over the wall" to test engineers when the developers have finished coding. To be cost effective, testing needs a life cycle that runs concurrently with the software development life cycle. The testing life cycle begins during the requirements elucidation phase of the software development life cycle, and concludes when the product is deemed ready to install or ship following a successful system test (or whatever you call the final test in your culture).

Ideally, you should be aware of the importance of test documentation, especially a formal test plan, in your life cycle. You should be aware that there are good standards for this documentation in the software industry; I strongly recommend the IEEE Software Engineering Standards collection in general, and IEEE-Standard-829, the Standard for Software Test Documentation, in particular.

Finally, you should have the desire to improve your knowledge of the testing process, and a commitment to make improvements.

How to Read This Book

This book is laid out in the following order:

1. Chapter 1 provides a general overview of the software development life cycle, and an overview of the concurrent testing life cycle. It also shows the need for a testing process model, and illustrates the Level 0 and Level 1 IPO diagrams for this model.

2. Chapters 2 through 7 show decomposition of the Level 1 IPO diagram into a series of Level 2 and Level 3 diagrams, to show the tasks involved in the review of program plans, the creation of a test plan, the creation of test design (and other test documentation), the performance of formal test, and finally, the updating of test documentation. This set of tasks can be used as the basis for an estimate of a formal testing program on a particular development program.

3. Chapter 8 provides some thoughts on how to use and customize this process model, which is a soup-to-nuts model. You may not be in a position to use such a model initially, but on a large program, putting this process in place should be your goal. Customizing the model should be your goal on a medium- or small-size project. Thus, this model provides a template for a testing process improvement program. To end the text itself, Chapter 9 provides a brief summary of the model in its entirety.

4. To conclude the book, I've added four practical Appendices, a Glossary, a Bibliography, and, of course, an Index. Although most of these final sections are standard fare, the Appendices bear describing. Appendix A provides a brief description of the Capability Maturity Model for Software (CMM-SW) from the Software Engineering Institute at Carnegie Mellon University, as well as a brief description of the SEI's more recent Capability Maturity Model Integration (CMMI). Appendix B provides templates for five preferred practices that are significant to a software development program, including a Program Management Plan (PMP), a Software Development Plan (SDP), a Software Quality Assurance Plan (SQAP), a Configuration Management Plan (CMP), and a Test Plan (TP). Appendix C provides a questionnaire you can use to evaluate the state of your testing process. Let this guide your use of the Formal Testing Process Model as a mechanism to help you improve your testing process. Appendix D contains guidelines for test execution.

To gain a detailed understanding of the tasks, inputs, and outputs involved in formal testing, you can simply read the book from front to back. For readers desiring to explore specific aspects of the testing process (for example, test execution), I would recommend skimming Chapter 1 (Introduction), reviewing Chapter 2 (The Formal Testing Process Model: Level 1 IPO Diagrams) to identify which process you want to explore in more detail, and then proceeding directly to that chapter (for one example, test execution, see Chapter 6, "Perform Formal Test," and Appendix D). Though there are sections for each subprocess of the major processes (for example, subprocess Execute Test, major process Perform Formal Test), I would suggest there are enough interfaces between the various subprocesses that the entire section should at least be skimmed.

I don't spend much time discussing the topic of testing tools and test automation, because these are really software development programs, not test processes. However, there are two subprocesses defined for Acquire Test Software in Chapter 5, which should be reviewed. One subprocess deals with building your own test software tools; the second subprocess addresses the activities involved in buying the tools you need.

I hope that this book will start you thinking about the following questions:

• Does our current testing process have all the tasks we need to identify defects in the products under test? If not, which tasks should we add first to improve our testing process?

• Have we identified the next step in our continuous process improvement plan?

• Do we have the necessary reviews in our testing process (for example, Test Plan inspection)?

• Are we writing all the test documentation we need to provide high-quality repeatable tests?

• Are testing and test coverage adequate for our program (for example, enough time, enough test points)?

If you can answer yes to all these questions, then you probably don't need to read this book in its entirety. But are you sure that your test process is as good as it can be?

There is one additional caveat to keep in mind as you read this book: You will not find the term "bugs" used. I once heard a testing guru state at a conference that he didn't like the term "bugs," since the term implies that bugs have a life of their own. Those of you with a memory for computer trivia will remember that the term originated when workers on an early Navy computer found that a large moth had become trapped between the contacts of a relay, causing the computer to malfunction.[3] Well, folks, in software, defects don't jump in by themselves. Human beings, as software engineers, put defects in code. Now, the defects may not all be our fault, since we may just be doing what we were told to do when the customer gave us some incorrect requirements, but nonetheless, people put defects in; defects don't just appear spontaneously. So, in this book, you'll see the terms "defects," "incidents," and "problems," but you won't see the term "bugs."

[3] As a Lieutenant in the United States Navy, the late Retired Rear Admiral Grace Hopper and her team discovered a moth trapped between the contacts of a relay, preventing the relay from passing current. As a result of this episode, Hopper is credited with coining the terms "bug" and "debug," to describe computer errors and how to fix them. Source: People and Discoveries, www.pds.org/wgbh/aso/databank/entries/btmurr.html.

Scope of Coverage

This book will present a description of a process and a set of tasks that can be used to implement or improve a formal testing program. Formal testing means test activities performed by a group of trained professional test engineers whose chain of command is independent of development management. Examples of formal testing are system testing and acceptance testing. By contrast, informal testing consists of test activities performed by the development group. Examples of informal testing are unit testing and component testing. Integration testing may also be informal testing, if members of the software development group perform that testing.

In addition, this book will identify the testing tasks and the inputs and outputs associated with each task, presenting them by means of classical Input-Process-Output diagrams. Tasks that span the testing life cycle, from test planning through test execution and beyond, will also be addressed.

Features

This book addresses testing activities and tasks that span the entire testing life cycle—from review of program plans, to developing the formal test plan, through developing the remainder of the test documentation (test design, test cases, test procedures) as well as acquiring any needed automated testing tools, through test execution, to final updating of the test documentation after test execution is complete.

The test activities and tasks will be represented using the technique of classical Input-Process-Output (IPO) diagrams, as suggested by Watts Humphrey (Humphrey, 1989). The test activities, tasks, and processes are in accordance with those documented in IEEE-Standard-829, the Standard for Software Test Documentation, and are techniques suggested by a number of testing specialists, including personnel at the Quality Assurance Institute (Orlando, Florida) and Software Quality Engineering (Jacksonville, Florida). These are techniques I have used for many years on a variety of hardware and software programs, of large and small size. Hopefully, use of this method and adoption of some or all of the processes will help you as much as they have helped me.

Readers should keep one additional point in mind: The full-blown model described in this book details a full-featured formal testing process that is applicable to large programs, and that would fully support programs deliverable to state and federal governments, programs delivering safety-critical systems, or programs having significant impact on corporate profits. For work on smaller systems, or in organizations where a formal testing process is just getting organized, this process model must be significantly tailored and reduced in scope. Guidance in tailoring this process model and using it on projects and programs of various sizes is provided in Chapter 8. As with any process improvement opportunity, don't try to do everything at once. Prioritize your process improvements and proceed slowly and methodically. Plan your work on small projects so as to have lots of little victories. In today's sound-bite management environment, it is very important to show lots of accomplishments. It is essential to make sure you can publicize your and your group's accomplishments if you are going to survive in this new world of short-term planning and thinking.


Benefits to the Reader

Why should you, the reader, care about a book that provides a detailed definition of a software testing process? A short answer is, because you need to learn more about how to do testing, and that means knowing what the testing process should be. Another answer is, you believe the testing process you are now using could be improved and you need some guidance.

This book, and the model detailed in it, is designed to fill both needs.

If you are a new test engineer or test engineering manager who has not spent ten-to-twenty years in the testing arena, such a model can serve as a confirmation of the "good things you are doing" and a useful guide to "things you should be doing." In Critical Testing Processes, testing expert Rex Black goes into a beautifully detailed set of steps for developing a testing estimate (Black, 2003). Use of a process model such as the one I define in this book can identify a list of tasks that form the foundation for such an estimate.

At the risk of repeating myself, I'd again like to note: My goal is to address testing activities and tasks that span the entire testing life cycle, from reviewing program plans, to developing the formal test plan, through developing the remainder of the test documentation (test design, test cases, test procedures) as well as acquiring any needed automated testing tools, through test execution, to final updating of the test documentation after test execution is complete. To do all this, you can use the technique of IPO diagrams shown in the following chapters to model your current testing process. This approach will provide you with a model of a software system that is an analog to the current logical model defined in the many seminars and books on the structured methods put out during the 1970's and 1980's by staff and consultants at Yourdon, Inc., and published by Yourdon Press.[4] You could then take your current logical model and compare it with the model defined in this book, and decide whether any elements are missing from your process. Finally, you could start a process improvement activity to implement some of those missing elements, so that your process will be improved. An improved testing process should result in your identifying more defects prior to delivering your products, either by finding those defects prior to coding (through reviews associated with developing test documentation), or by finding defects during test execution because your improved procedures result in better testing.

[4] Although I could cite many seminars and books, the seminal works on structured methods seem to me to be ones published by Yourdon Press (Yourdon and Constantine, 1975; DeMarco, 1978; Page-Jones, 1980; McMenamin and Palmer, 1984).


Before you jump into the book, let me offer another advisory note: Read the Glossary. Currently, there is ambiguity in the testing industry regarding some of the terms we use on a daily basis, such as "test plan," "test cases," and "test procedures." This book adheres to the terminology from IEEE-Standard-610.12, the Standard Glossary of Software Engineering Terminology. Many companies use the term "test plan" to encompass all test documentation. In such an instance, the IEEE-type test plan is the first chapter, and test cases and test procedures are contained in appendices. That isn't what is meant by "test plan" in this book. Similarly for the term "test cases"—in some companies, test cases are considered to be the step-by-step instructions for executing tests. This book uses the term "test procedures" to cover those step-by-step instructions; in IEEE terminology, "test cases" specify the input data and the expected results when those data are input to the system under test. So, keep this book's Glossary handy to avoid confusion.
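To make that distinction concrete, here is a hypothetical sketch that keeps the two artifacts separate, following the IEEE-style definitions just given. The field names, identifiers, and login scenario are invented; neither this book nor IEEE-Standard-829 prescribes this particular representation.

    # A hypothetical illustration of the distinction described above:
    # a test case pairs input data with expected results, while a test
    # procedure holds the step-by-step execution instructions.
    from dataclasses import dataclass

    @dataclass
    class TestCase:
        identifier: str
        inputs: dict          # input data fed to the system under test
        expected: dict        # expected results for those inputs

    @dataclass
    class TestProcedure:
        identifier: str
        case_ids: list        # test cases this procedure executes
        steps: list           # step-by-step instructions

    login_case = TestCase(
        identifier="TC-001",
        inputs={"username": "alice", "password": "wrong-password"},
        expected={"result": "login rejected", "error": "invalid credentials"},
    )

    login_procedure = TestProcedure(
        identifier="TP-001",
        case_ids=["TC-001"],
        steps=[
            "Start the application and open the login screen.",
            "Enter the username and password from the test case inputs.",
            "Press Submit and record the observed result.",
            "Compare the observed result against the expected results.",
        ],
    )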

Now that you have finished this Preface, turn to Chapter 1, "Introduction," and begin to read the book itself. Hopefully, you'll enjoy reading it as much as I have enjoyed developing the model and writing the book.

August 2003
R.D.D.
Columbia, Maryland


Chapter 3
Extract Test Information from Program Plans: Levels 2 and 3 IPO Diagrams

3.1: Overview

In Chapter 2, we reviewed the five Level 1 IPO diagrams for the formal testing process, as listed below:

1. Extract Test Information From Program Plans, process 1.1
2. Create Test Plan, process 1.2
3. Create Test Design, Test Cases, Test Software, And Test Procedures, process 1.3
4. Perform Formal Test, process 1.4
5. Update Test Documentation, process 1.5

In this chapter, I expand the detail of the Level 1 IPO diagram presented in Chapter 2 for process 1.1, Extract Test Information From Program Plans. The goal of this process is to extract any requirements for testing from the higher-level plans, so that the scope of the test effort can be identified.

There can be a significant variation in the specific program plans associated with different organizations and the various projects worked on by those organizations. Requirements for program plans may vary by customer, with some programs requiring formal Program Management Plans, Configuration Management Plans, Quality Assurance Plans, and others. Other programs may require various formal and informal forms of documentation plans, build plans, marketing plans, and sales plans. Sample plans are listed below, and are typical of plans required on a large program in a highly regulated environment. You can customize your process model to show the specific plans your program requires.

In the Level 2 IPO diagram (see Figure 3-2), four subprocesses are included within one possible instance of Extract Test Information From Program Plans, process 1.1:

1. Review Program Management Plan
2. Review Quality Assurance Plan
3. Review Software Development Plan
4. Review Configuration Management Plan

Following analysis of these four subprocesses, we'll look at the Level 3 IPO diagrams for each of the four.

Preferred practices, including a template and table of contents (TOC) for each of these documents, are shown in Appendix B.

In this chapter, we change the format somewhat from the IPO diagrams that were presented in Chapter 2, in which an individual IPO diagram was used to illustrate each process.

In this and subsequent chapters, "combined" IPO diagrams provide supporting detail to the Level 1 IPO diagrams that appeared in Chapter 2. These combined IPO diagrams are similar in format to Figure 1-7, which showed a group of IPO diagrams, connected in series, that provided additional detail to the Level 0 IPO diagram of Figure 1-6.

3.2: Extract Test Information from Program Plans—Level 2 IPO Diagram

The IPO diagram for process 1.1, Extract Test Information From Program Plans, is shown in Figure 3-1. The accompanying Level 2 IPO diagram is shown in Figure 3-2. As named in the preceding section, there are four distinct subprocesses associated with these IPO diagrams, which in this listing have been shortened to first-initial abbreviations:

1. Review PMP
2. Review QAP
3. Review SDP
4. Review CMP
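As a rough sketch of this decomposition in code, the fragment below records process 1.1 with its inputs, outputs, and subprocesses. Inputs and outputs are paraphrased from Sections 3.2.1 and 3.2.5; subprocess number 1.1.1 is confirmed in Section 3.3, while the numbers 1.1.2 through 1.1.4 are inferred from the section order rather than stated in the text.

    # Process 1.1 and its Level 2 decomposition as plain dictionaries.
    # Numbers 1.1.2-1.1.4 are inferred from the section order.
    process_1_1 = {
        "number": "1.1",
        "name": "Extract Test Information From Program Plans",
        "inputs": ["PMP", "QAP", "SDP", "CMP", "standards and templates"],
        "outputs": ["reviewed plans with comments", "testing requirements"],
        "subprocesses": {
            "1.1.1": "Review Program Management Plan",
            "1.1.2": "Review Quality Assurance Plan",
            "1.1.3": "Review Software Development Plan",
            "1.1.4": "Review Configuration Management Plan",
        },
    }

    # Walk the decomposition the way the Level 2 diagram presents it.
    print(f"{process_1_1['number']} {process_1_1['name']}")
    for number, name in process_1_1["subprocesses"].items():
        print(f"  {number} {name}")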


Figure 3-1: The goal of process 1.1, Extract Test Information From Program Plans, is to reveal what sorts of requirements are mandated by higher-level plans.

Figure 3-2: Process 1.1 is expanded here to become a Level 2 IPO diagram.


3.2.1: Inputs

The four program plans—Program Management Plan (PMP), Quality Assurance Plan (QAP), Software Development Plan (SDP), and Configuration Management Plan (CMP)—as well as the standards and templates that provide the formats for these documents, are the principal inputs to this IPO diagram.

If your program uses a different set of program plans, just insert the names of your program plans in the appropriate process blocks, and implement the tasks shown in the Level 3 IPO diagrams for those plans.

3.2.2: Feedback

Results of the review of the four program plans are fed back into this process. When updates to the plans are made and the plans are submitted for a subsequent review, these review results are factored into that review.

3.2.3: Feedback to Earlier Process Steps

Issues identified in each of the four plans are fed back to their respective authors.

3.2.4: Process Flow, Level 2 IPO Diagram

This testing process is generally concerned with reviewing other program- or project-level documents, to see what requirements they contain that will impact or influence the formal testing staff and the system test plan. Usually, most of these documents are developed during the planning portion of the life cycle (see Figure 1-1) that precedes the requirements phase. As far as the testing process is concerned, the order in which the documents cited are reviewed is immaterial. However, since the Program Management Plan (see IEEE-Standard-1058 for suggested content and format) is the document to which all other program plans must be linked and synchronized, the PMP should be reviewed first.[1]

[1] For additional information, see CMMI process area Project Planning, discussed in Ahern et al., 2001, or the CMMI section of the SEI's Website, www.sei.cmu.edu.

Note that the review of program plans to extract test information from those program plans links to CMMI process area Integrated Project Management, and to the practice Manage Dependencies within the specific goal Coordinate with Relevant Stakeholders.


Quality assurance personnel or software quality assurance staff members are responsible for performing functions and tasks that contribute to the quality of the processes and products on the project.[2] (See IEEE-Standard-730 for suggested content and format, and Drabick, 2000, for more details regarding software quality assurance functions.) These functions and tasks are normally documented in a formal Quality Assurance Plan. Since QA has many interfaces with test engineering (TE), the QA functions can impact the test plan.

[2] Remember that I use the abbreviations "QA" and "SQA," as well as the terms they stand for, synonymously in this book, although there can be significant differences between the two on a program or project that involves both hardware and software products.

The Software Development Plan is normally only written on large programs or projects, and identifies the tasks that software engineers are assigned to perform. Since there are many interfaces between the software engineering and test engineering divisions in a company, this plan, if developed, is of critical importance to development of the system test plan. This plan should identify the informal and semiformal testing tasks that software engineering performs (for example, unit testing, component structural testing, and integration testing).

Configuration management is normally responsible for maintaining the versions of documentation and software, and is often responsible for software builds. These functions and tasks are normally documented in a formal Configuration Management Plan (see IEEE-Standard-828 for suggested content and format). Since configuration management maintains the integrity of the code that test engineering is testing, the interfaces between CM and the test team are very important, and the CM functions can impact the test plan.

Most organizations will have available a set of standards and templates to use in development of documentation and computer software. IEEE and a variety of military standards provide such templates. A CDRL can also include data item descriptors that specify the organization of deliverable documents. As in other instances, use of such standards and templates can prevent an organization from having to reinvent the wheel when creating documentation. While test engineering staff members review the program plans, they and QA personnel should be sure that the plans are in conformance with the applicable standards and templates used on the program.

3.2.5: Outputs

The reviewed program plans, with included comments, are one major output of this subprocess. The other major output of this subprocess is the testing requirements extracted from each of the program plans. One example of a testing requirement is the schedule for formal testing contained in the PMP. Another example is the description of the defect- or incident-handling procedure normally documented in the QAP.

The comments in the reviewed program plans will identify any issues that test engineering staff members have with those documents. The issues may be simple inconsistencies between the various documents (for example, the program plan specifies one set of test levels, and the QAP specifies a different set), or they may indicate philosophical problems between the various groups. Again, early resolution of these problems will simplify matters later in the program.

3.2.6: Feedback from Later Process Steps

It is quite possible that not all the inconsistencies or problems with the various program plans will be identified in the review. As other problems are identified during later phases in the program life cycle (most commonly during the detailed design phase), it may be necessary to update one or more of the program plans to resolve those problems. As changes are made to the program plans to incorporate those fixes, some additional review of the plans by test engineering staff (and also by quality assurance personnel) will be required.

3.3: Review Program Management Plan—Level 3 IPO Diagram

The Level 3 IPO diagram for subprocess 1.1.1, Review Program Management Plan, is shown in Figure 3-3. Significant inputs for this subprocess are straightforward:

1. Program Management Plan
2. standards and templates

The outputs from this process include four significant products:

1. Program Management Plan following review by the personnel in the test engineering (TE) area
2. review comments (from TE)
3. PMP issues (including format and testing issues)
4. testing requirements (including testing life cycle requirements)


Figure 3-3: The Level 3 IPO diagram shows greater detail with expanded PMP inputs and outputs.

Five distinct subprocesses are identified in this Level 3 IPO diagram:

1. Review Table Of Contents Of PMP
2. Review Organization Section Of PMP
3. Review Life Cycle Section Of PMP
4. Review Testing Section Of PMP
5. Review Schedule Section Of PMP

3.3.1: Inputs

The primary input for this subprocess is the Program Management Plan, which is ready for review.

3.3.2: Internal Feedback

Since review of any document is an iterative process, review comments may flow to the authors on multiple occasions, depending on how many times the Program Management Plan is revised before approval takes place. Thus, updated versions of the PMP (under configuration management version control), based on comments from the review, may reenter this process asynchronously, triggering a repeat review.

3.3.3: Feedback to Earlier Process Steps

When issues are found in the PMP, those issues should be documented and presented to the program manager, so that they can be resolved in a timely manner. Timely resolution of these issues will assist test engineers as they put together a written, high-quality test plan, as well as the other program-level plans (for example, the Configuration Management Plan, the Quality Assurance Plan, and the Software Development Plan).

3.3.4: Process Flow, Level 3 IPO Diagram

The most important sections of a Program Management Plan that should be thoroughly read by test engineering staff members are the table of contents, the organization section, the life cycle section, the testing section, and the schedule section. Though there are other sections of the document that will be of interest, the cited sections deserve to be scrutinized.

Test engineers and quality assurance experts should review the table of contents and compare it with the standards and templates that apply to a Program Management Plan. Why is the table of contents so important? One reason is that comparing it with the PMP standards and templates will immediately alert personnel if any important sections have been omitted. For example, if you record documentation problems in your defect-tracking system (something I highly recommend) and the table of contents shows no comparable steps, you know you will have a tracking problem. Such omissions should be documented as a potential defect and entered into the defect-tracking system to alert program management to the potential problem area, and then a report of the omissions should be sent to the plan's author as soon as possible, so that the issues can be resolved.
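
One way to make the table-of-contents comparison systematic is to diff the plan's section titles against the template and turn each omission into a candidate defect record. A minimal sketch, assuming the template and the PMP's contents are available as simple lists (all names below are hypothetical):

    # Hypothetical sketch: flag PMP sections required by the template but
    # absent from the plan's table of contents, as candidate defect entries.
    TEMPLATE_SECTIONS = ["Introduction", "Organization", "Life Cycle",
                         "Testing", "Schedule", "Risk Management"]

    pmp_contents = ["Introduction", "Organization", "Life Cycle", "Schedule"]

    def toc_omissions(template, contents):
        present = {s.lower() for s in contents}
        return [s for s in template if s.lower() not in present]

    for section in toc_omissions(TEMPLATE_SECTIONS, pmp_contents):
        # Each omission becomes a potential defect to enter into tracking.
        print({"document": "PMP", "type": "documentation",
               "summary": f"Required section missing: {section}"})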

Next, test engineering staff should review the organization section of the PMP. If IEEE-Standard-1058 is followed for the PMP, the organization section should identify each organization on the program, its tasks, and the interfaces between the organizations. Although this list may not be fully complete, test engineering staff should review the interfaces with each of the other groups on the program, since these interfaces will influence the testing tasks, configuration control, responsibilities, and staffing sections of the test plan.

The section of the PMP that defines the development life cycle also should be carefully reviewed, because the program life cycle will impact the testing life cycle. As an example, if the program schedule is based on an iterative life cycle and the testing staff has assumed it is based on a one-shot waterfall life cycle, the test plan will not synchronize with the PMP, leading to innumerable problems as the program progresses.

Test engineering staff members probably wrote the testing section of the PMP, and therefore will not need to read it again. If they did not, they need to be sure that they really are planning to do the tasks that this section of the PMP has allocated to them. If a manager who is not familiar with testing techniques, processes, and life cycles wrote this section, there will be synchronization problems as the program proceeds.

Test engineering staff members should review the schedule section of the PMP to ensure that there are no surprises in the schedule. Often, this section of a PMP references a Microsoft Project™ schedule; if that is the case, that schedule should be reviewed. The TE staff should ensure that all the proper work breakdown structure tasks appropriate to the program are identified on the schedule, and are in the proper order. Although it is not likely that anyone would expect formal testing to begin prior to the start of unit testing or integration testing, you never can take this for granted. The testing staff also must ensure that the proper reviews, inspections, and walkthroughs, including review of test documentation by other staff members, have been identified in the schedule.
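
The ordering constraint is easy to check mechanically once the schedule's WBS tasks are exported with their planned start dates. A toy sketch, with hypothetical task names and dates:

    from datetime import date

    # Hypothetical export of WBS tasks with planned start dates.
    schedule = {
        "unit testing":        date(2004, 3, 1),
        "integration testing": date(2004, 4, 15),
        "formal testing":      date(2004, 4, 1),   # suspicious: before integration
    }

    # Tasks that must start no later than formal testing starts.
    PREDECESSORS = ["unit testing", "integration testing"]

    for task in PREDECESSORS:
        if schedule[task] > schedule["formal testing"]:
            print(f"Schedule issue: formal testing starts before {task}")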

A PMP written according to the format of IEEE-Standard-1058 will have additional sections that will be of some interest to test engineering, including the Risk Management and Change Reporting sections. However, the five subprocesses discussed here are the most important ones to be considered by personnel in test engineering, and should draw the most attention.

3.3.5: Outputs

Outputs from these subprocesses include six products:

1. PMP issues
2. format and testing issues (in context of the PMP)
3. testing life cycle requirements (for use in the test plan)
4. testing requirements (from both the testing and schedule sections)
5. Program Management Plan (reviewed by test engineering staff members)
6. test engineering's review of the PMP

3.3.6: Feedback from Later Process Steps

The subject of feedback has been discussed in Section 3.2. Note, however, that it is quite possible that not all the inconsistencies or problems with the Program Management Plan will be identified in the review. As other problems are identified during later phases in the program life cycle (for example, detailed design), it may be necessary to update the PMP to resolve those problems and hold another review.

3.4: Review Quality Assurance Plan—Level 3 IPO Diagram

The Level 3 IPO diagram for subprocess 1.1.2, Review Quality Assurance Plan, is shown in Figure 3-4. Significant inputs are again straightforward:

1. Quality Assurance Plan
2. standards and templates

Outputs from this process follow:

1. Quality Assurance Plan (reviewed by test engineering)
2. review comments (from test engineering)
3. QAP issues (including format and testing issues)
4. testing requirements

Five distinct subprocesses are identified in this Level 3 IPO diagram:

1. Review Table Of Contents Of QAP
2. Review Standards Section Of QAP (including practices and conventions)
3. Review "Reviews" Section Of QAP
4. Review CM Section Of QAP
5. Review Incident-Reporting Section Of QAP



Figure 3-4: A Level 3 IPO diagram for the Review Quality Assurance Plan subprocess shows added detail.

3.4.1: Inputs

The primary input for this subprocess is the Quality Assurance Plan, which is ready for review.

Test engineering needs to be aware of the standards used on a program, including standards for documentation—and especially for testing documentation. This information on standards used in an organization should be documented in the QAP. The testing staff should review this section of the QAP, to be sure that quality assurance staff members are using the same standards for testing documentation as are being used by test engineering personnel, and also to be aware of what standards, templates, practices, and conventions will be used for requirements, design, coding, and testing documentation.

Test engineering staff members should participate in reviews of developer documentation (requirements, design, code, unit test plans, and so on), and need to be aware of what reviews and audits QA is planning. This section of the QAP will identify the reviews and audits that department staff members will conduct in the testing areas. Individuals on the testing staff need to be aware of these reviews so that they can prepare for them and participate in them.

QA requirements regarding configuration management activities are also important to test engineering, since CM will most probably prepare builds for test. Test engineering staff should thoroughly review this section of the QAP to make sure that QA activities in this area are synchronized with test engineering activities.

Finally, test engineering personnel need to be familiar with the policies, process, and tools used for documenting and tracking incidents and defects recorded on the program, since test engineering staff members will have to conform to these policies and processes when documenting and tracking incidents found during formal test. If no Quality Assurance Plan is written (possibly because the project is too small to need a formal plan), then information regarding incident-tracking must be documented in the test plan.
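
Whatever tool the QAP prescribes, test engineering ends up recording much the same core facts for each incident. A minimal sketch of such a record; the field names here are illustrative, not drawn from any particular tracking product:

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class IncidentReport:
        """Illustrative incident record; field names are hypothetical."""
        identifier: str                 # assigned by the tracking system
        summary: str
        test_item: str                  # build or component under test
        test_procedure: str             # procedure step that exposed it
        severity: int                   # per the severity scale in the QAP
        reported: date = field(default_factory=date.today)
        status: str = "open"            # open -> analyzed -> fixed -> verified

    incident = IncidentReport("IR-042", "Report totals wrong after midnight",
                              "Build 1.3.7", "SYS-PROC-12 step 4", severity=2)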

3.4.2: Internal Feedback

Since review of any document is an iterative process, review comments may flow to the authors on multiple occasions, depending on how many times the QAP is revised before approval takes place. Thus, updated versions (under configuration management version control, of course) of the Quality Assurance Plan, based on comments from the review, may reenter this process asynchronously, setting off a repeat review.

3.4.3: Feedback to Earlier Process Steps

When issues are found in the Quality Assurance Plan, those issues should be documented and presented to the QA manager, so that the issues can be resolved. Timely resolution of these issues will help assure a high-quality test plan, as well as the quality of the other program-level plans.

3.4.4: Process Flow, Level 3 IPO Diagram

Principal sections of the Quality Assurance Plan that should be reviewed carefully by test engineering personnel are the table of contents; the standards, practices, and conventions section; the reviews and audits section; the configuration management section; and the incident-reporting section. Other sections of this document may be of interest to TE staff, of course, but the cited sections should be reviewed most thoroughly.

Test engineering staff members first should quickly review the contents listings in order to compare contents with the standards and templates that apply to a QAP. This will immediately alert TE staff members whether any sections deemed significant by them have been omitted. If you record documentation problems in your defect-tracking system, as I recommend, such omissions should be entered into the defect-tracking system to alert management to the potential problem area. If this is not an option, at the least, send a report detailing omissions to management as soon as possible, so that the issues can be resolved.

The next section test engineering staff members review is the standards, practices, and conventions section of the QAP. This section of the plan should identify material to be used by all functions on the program. As well as specifying standards for documentation produced by test engineering, this section will also identify policies and procedures for other program activities, which can give test engineers valuable insight into tasks other groups are performing that may impact the way test engineers perform their job.

Reviews and audits are defined in another section of the QAP. Because members of the testing staff should attend some of these reviews (for instance, requirements inspection), and because some of the reviews (for example, inspection of the test plan) will be targeted at deliverables produced by test engineering, it is important for TE personnel to pay close attention. The reviews and audits specified in the QAP can provide requirements for tasks in the test plan and in the testing schedule.

The configuration management section of the QAP will provide information about CM activities that can affect test engineering. These activities will be documented in more detail in the Configuration Management Plan, which will also be reviewed by test engineering, but the QAP contains detail test engineers need to know.

Test engineering staff also should review the incident-reporting section of the QAP to understand the policies, processes, and tools involved in tracking defects and incidents on the program. Test engineering staff will use this system during formal test execution, so it is essential that testing staff be adept in its use, and that staff members are knowledgeable about the policies and processes involved.

A Quality Assurance Plan written according to the format proposed by IEEE-Standard-730 will have additional sections that will be of interest to test engineers (see especially the Tools, Techniques, and Methodologies section and the Records Retention section). However, the five sections discussed here are the most valuable for test engineering.

3.4.5: Outputs

Five outputs are produced:

1. QAP issues
2. format and testing issues
3. testing requirements (for use in the test plan)
4. the reviewed (and updated) QAP
5. QAP review comments from test engineering

3.4.6: Feedback from Later Process Steps

Not all the inconsistencies in or problems with the Quality Assurance Plan will be identified in the review. As other problems are identified during later phases in the program life cycle, it may be necessary to update the QAP to resolve those problems.

3.5: Review Software Development Plan—Level 3 IPO Diagram

On a large program, the software engineering staff may consist of so many members and have so much work to do that it will need its own written, formal Software Development Plan to support the Program Management Plan. If a Software Development Plan is written, test engineers should review this plan. A Software Development Plan normally will have the following sections:

1. introduction
2. software life cycle
3. software quality factors
4. software components
5. development schedule
6. software engineering specification
7. software quality assurance and testing
8. software configuration management



The Level 3 IPO diagram for subprocess 1.1.3, Review Software Development Plan, is shown in Figure 3-5, and has two inputs:

1. Software Development Plan
2. standards and templates

Four outputs result from this process:

1. Software Development Plan after review by test engineering
2. review comments from test engineering
3. Software Development Plan issues (including format and testing issues)
4. testing requirements

Figure 3-5: The Level 3 IPO diagram contains six SDP subprocesses to be completed by test engineering staff members.

Six distinct subprocesses are identified in this Level 3 IPO diagram:

1. Review Table Of Contents Of SDP
2. Review Life Cycle Section Of SDP
3. Review Quality Factors Section Of SDP
4. Review Software Components Section Of SDP
5. Review Schedules Section Of SDP
6. Review Software Engineering Section Of SDP

3.5.1: Inputs

The primary input for this subprocess is the current version of the Software Development Plan, which is ready for review.

Test engineering staff members need to be aware of the standards used to develop the Software Development Plan, so that they can compare the plan to the appropriate standards or templates.

3.5.2: Internal Feedback

Since review of any document is an iterative process, review comments may flow to the authors on multiple occasions, depending on how many times the Software Development Plan is revised before final approval takes place. Thus, updated versions of the Software Development Plan (under configuration management version control, of course), based on comments from the review, may reenter this process asynchronously, setting off a repeat review.

3.5.3: Feedback to Earlier Process Steps

When issues are found in the Software Development Plan, those issues should be documented and presented to the software development manager, so that they can be resolved in a timely manner. Resolution of these issues will assist test engineers in preparing a high-quality test plan, as well as the other program-level plans (for example, the Configuration Management Plan, the Program Management Plan, or the Quality Assurance Plan).

3.5.4: Process Flow, Level 3 IPO Diagram

The sections of a Software Development Plan that should be reviewed by test engineers are the table of contents, the life cycle section, the quality factors section, the software components section, the schedule section, and the software engineering section.

Test engineers should review the table of contents and compare it with the standards and templates that apply to a Software Development Plan. Such comparison will alert test engineering personnel to any omitted sections. Omissions should be documented as a potential defect, and entered into a defect-tracking system, if available, to alert management to the potential problem area. Otherwise, a report of the omissions should be sent to department managers as soon as possible, so that the issues can be resolved.

Next, test engineering staff members review the life cycle section of the Software Development Plan, comparing it against the life cycle section of the Program Management Plan. The two life cycles should be identical. As with the PMP, the program life cycle defined in the Software Development Plan will impact the testing life cycle. For example, if the software development schedule is based on a spiral life cycle and the testing staff has assumed that a one-shot waterfall life cycle is being used, the test plan will not synchronize with the software development life cycle (which in turn may be out of sync with the rapid application development life cycle identified in the PMP), leading to no end of problems as the program progresses.

The Software Development Plan should identify quality factors that are considered significant during development. Some examples of quality factors include maintainability and reliability. Knowing which quality factors are deemed important on the program will provide insight to test engineers, which may help them to develop a testing strategy to verify that those quality factors are present in the deliverable software products.

The software components section of the Software Development Plan defines components of the software deliverables that are being developed. This section should be reviewed by the test engineering staff, since the software components are highly likely to become test items in the test plan, test design, and test procedures. This sort of domain knowledge is extremely important to test engineering staff members as they lay out their test strategy. Later, during test execution, this information may assist the test engineering staff in documenting incidents when software problems are detected.
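
Because the components listed in the SDP tend to reappear as test items, it can pay to capture the mapping explicitly while reviewing this section. A hypothetical sketch of such a traceability list; the component and document names are illustrative only:

    # Hypothetical traceability from SDP software components to planned
    # test items; component and reference names are made up for illustration.
    sdp_components = ["report generator", "database loader", "GUI front end"]

    test_items = [
        {"component": c,
         "test_plan_ref": f"STP section 4, item {i + 1}",
         "procedures": []}            # filled in later, during test design
        for i, c in enumerate(sdp_components)
    ]

    # During test execution, an incident report can cite the component directly.
    print(test_items[0]["component"], "->", test_items[0]["test_plan_ref"])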

Test engineering staff members should review the schedule section of the Software Development Plan to ensure that there are no surprises in the schedule. Test engineering and quality assurance personnel also should compare the schedule section in the document with that contained in the PMP, and determine whether the two schedules synchronize. Often, this section of a Software Development Plan references a Microsoft Project™ schedule; if that is the case, that schedule should be reviewed. The testing staff should ensure that all the proper work breakdown structure tasks appropriate to software engineering are identified on the schedule, and are in the proper order—for example, formal testing should not start prior to the start of unit testing or integration testing. Testing staff also must confirm that the proper reviews, inspections, and walkthroughs, including development review of test documentation, have been identified in the schedule.

Since the software engineering section of the Software Development Plan will contain information regarding the software engineering environment as well as software development techniques and methodologies, test engineers should review this section. Test engineering personnel should be able to obtain some information regarding components of the test environment from this section. In addition, the development techniques and methodologies may provide information on development tools that can be used to assist in formal testing.

3.5.5: Outputs

Outputs from these subprocesses are listed below:

1. Software Development Plan issues
2. format and testing issues
3. testing requirements
4. reviewed Software Development Plan (after review by test engineering)
5. test engineering's comments on the Software Development Plan

3.5.6: Feedback from Later Process Steps

Test engineering personnel may not be able to identify all inconsistencies, omissions, and problems with the Software Development Plan during this review, prior to writing the test plan. As other problems are identified during the later phases in the program life cycle (for example, during detailed design) and in comparable phases in the testing life cycle (for example, during test design), it may be necessary to make updates to the Software Development Plan to resolve problems as they are discovered.



3.6: Review Configuration Management Plan—Level 3 IPO Diagram

The Level 3 IPO diagram for subprocess 1.1.4, Review Configuration Management Plan, is shown in Figure 3-6. There are two inputs for this subprocess:

1. Configuration Management Plan
2. standards and templates

There are four outputs from this process:

1. Configuration Management Plan reviewed by test engineering
2. review comments from test engineering
3. CMP issues, including format and testing issues
4. testing requirements

Figure 3-6: The Level 3 IPO diagram contains six CMP subprocesses for test engineers to review.



Six distinct subprocesses are identified in this Level 3 IPO diagram:

1. Review Table Of Contents Of CMP
2. Review Management Section Of CMP
3. Review CM Activities Section Of CMP
4. Review Tools Section Of CMP
5. Review Records Section Of CMP
6. Review Storage and Delivery Section Of CMP

3.6.1: Inputs

The primary input for this subprocess is the Configuration Management Plan, which is ready for review. The testing staff needs to be aware of the standards used to develop the Configuration Management Plan, so that the plan can be compared against the appropriate standards or templates. IEEE-Standard-828, supported by IEEE-Standard-1042, can be used.

3.6.2: Internal Feedback

Since review of any document is an iterative process, review comments may flow back to the authors on multiple occasions, depending on how many times the CMP is revised before approval takes place. Thus, updated versions (under configuration management version control) of the CMP may reenter this process asynchronously, setting off a repeat review by test engineering (as well as by other interested groups or personnel).

3.6.3: Feedback to Earlier Process Steps

When issues are found in the CMP, those issues should be documented and presented to the configuration manager, so that those issues can be resolved promptly. Development of a high-quality test plan, as well as the other program-level plans (the Software Development Plan or the Program Management Plan, for example), will be assisted by timely resolution of such issues.

3.6.4: Process Flow, Level 3 IPO Diagram

The most significant sections of a Configuration Management Plan that should be reviewed by test engineering are the table of contents; the management section; the configuration management activities section; the tools, techniques, and methodologies section; the records collection and retention section; and the storage and delivery of program media section. Though there are other sections of this document that may be of interest, the sections listed here should be reviewed most thoroughly.

Test engineering should review the table of contents and compare it with the standards (for example, IEEE-Standard-828) and templates that apply to a CMP. The test engineering staff should immediately alert its managers to any discrepancy or omission. Such omissions should be documented as a potential defect, and entered into the defect-tracking system to alert management to the potential problem area. Otherwise, a report of the omissions should be sent to the configuration manager as soon as possible so that the issues can be resolved.

Next, test engineering personnel should review the management section of the CMP, which should document CM staff responsibilities as well as the top-level policies that the CM group will administer on this specific program. Since these responsibilities and policies will impact test engineering, its staff needs to be aware of and comfortable with the responsibilities and policies.

The CM activities section of the CMP should document configuration identification methods, configuration status accounting, configuration verification, CM audits and reviews, and the structure of the CM libraries. Again, test engineering staff members need to be aware of and comfortable with the activities and techniques used by CM, since there will be frequent interfaces between CM and the testing staff.

The tools, techniques, and methodologies section of the CMP should be reviewed by test engineering because each tool or technique or methodology can impact the interfaces between the CM group and test engineering. Use of the tools and methodologies by CM may well affect how long it will take test engineering personnel to get an updated build from CM. Also, if CM is planning to inject new builds on a daily basis and test engineering is planning to receive a new build only weekly, the interface will not be synchronized and will affect the program schedule.
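
The build-cadence mismatch in particular is easy to quantify during the review. A toy calculation, with hypothetical cadences:

    # Hypothetical cadence check: builds delivered by CM vs. builds the
    # testing staff plans to accept. Frequencies are in builds per week.
    cm_builds_per_week = 5      # CM plans a new build every working day
    te_intake_per_week = 1      # test engineering plans to take one per week

    backlog_per_week = cm_builds_per_week - te_intake_per_week
    if backlog_per_week > 0:
        print(f"Interface not synchronized: {backlog_per_week} builds per "
              "week will accumulate without entering formal test.")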

Test engineering staff should review the records section of the CMP so that the engineers will know where and how to find information regarding previous builds and change requests.

Test engineering staff members also should review the storage and delivery section of the Configuration Management Plan so that they will know where and how to obtain copies of updated builds, updated program documentation, and previous and current issues of the test documentation.

3.6.5: Outputs

Outputs from these subprocesses include the following:

1. Configuration Management Plan issues
2. format and testing issues
3. testing requirements (for use in the test plan)
4. the CMP review results (from test engineering)
5. comments on the CMP (from test engineering)

3.7: What's Next?

The following chapter presents details in the Level 2 and Level 3 IPO diagrams for process 1.2, Create Test Plan.


Index

Ahern, Dennis, 54n., 83, 199, 202, 203, 261, 275
American National Standards Institute (ANSI), 205, 206, 214, 222, 230, 231, 239
American Society for Quality, 186, 223
Analysis, 4, 41, 121, 150, 209, 241
  object-oriented, 121, 125
  requirements, 4, 6, 9, 11, 15, 19, 46, 74, 76, 78ff., 103, 120, 124, 182, 191, 218, 270
  risk, 41, 104, 187, 205
  structured, 24, 121, 125
Argila, Carl, 35, 279
Beck, Kent, 5, 181, 275
Beizer, Boris, 116, 275
Black, Rex, xxiv, 275
Black-box testing, 6, 35, 44, 45, 87, 99, 128, 270, 271
Boehm, Barry, 4, 275
Booch, Grady, 278
Bugs, xxii, 270, 271
Cantor, James, 181, 276
Capability Maturity Model (CMM), xviii, xix, xx, xxi, 3, 11, 13, 19, 31, 76, 111, 175, 199-202, 235, 260-61, 267, 268
  levels of, 20, 200
Capability Maturity Model Integration (CMMI), xix, xxi, 54, 77, 80, 87, 91, 98, 111, 135, 136, 175, 177, 199, 202-3, 261, 267
Capability Maturity Model-Software (CMM-SW), xx, 13, 20, 31, 87, 92, 98, 111, 176, 177, 199, 261, 267
Carnegie Mellon University, xix, xx, 199-202, 269
Central processing unit (CPU), 36, 145, 180, 242, 267
Change Control Board (CCB), 13, 233, 267
Change request, 27, 71, 104, 231
Clouse, Aaron, 275
Coding, xvii, xix, xxiv, 4, 7, 11, 15, 17, 42, 45, 61, 89, 122, 186, 200, 218
Commercial off-the-shelf (COTS), 105, 116, 117, 129, 140, 267
Communication, xviii, 14, 17, 34, 154, 176, 179, 181, 182, 183, 185, 207, 245, 260
Computer-aided software engineering (CASE), 125, 214, 267, 270
  tools, 125, 214
Computer software component (CSC), 235, 267
Computer software configuration items (CSCIs), 213, 221, 229, 230, 235, 238, 267
Configuration control, 133, 211, 226, 234, 244, 247, 254
Configuration management, 12, 14, 27, 31, 36, 37, 43ff., 55, 58, 60, 61, 62, 64, 66, 70, 71, 74, 76, 83, 84, 85, 88, 89, 105, 109, 113, 129, 140, 143, 156, 171, 172, 173, 177, 201, 203, 219, 220, 225, 229ff., 254, 255, 258ff., 267
  group, 129, 136, 172, 175, 234, 235, 254, 258, 259
  implementation of, 233, 237
  system, 75, 107, 109, 113, 115, 118, 119, 120, 123, 124, 126, 127, 128, 130, 134, 135, 146, 147, 148, 150, 155, 156, 158, 161, 164, 170, 171, 172, 173, 191, 192
Configuration Management Plan (CMP), xviii, xxi, 29, 30, 52, 53, 54, 55, 58, 63, 66, 69-72, 180, 182, 185, 190, 211, 219, 222, 229-37, 249, 267
  outline, 237
  sections of, 70-71

Constantine, Larry L., xxivn., 280
Contract Deliverable Requirements List (CDRL), xvi, 32, 55, 210, 217, 267
Cost, xvii, xviii, xix, 6, 16, 17, 21, 22, 76, 100, 104, 108, 125, 178, 185, 199, 207, 210, 212
Craig, Rick, 144, 186, 187, 261, 276
Crispin, Lisa, 183, 261, 276
Customer, xvii, xxii, 5, 6, 7, 11, 12, 17, 19, 21, 23, 45, 46, 48, 51, 79, 80, 85, 87, 88, 89, 104, 105, 108, 109, 130, 153, 154, 158, 180ff., 186, 200, 205, 206, 210, 213, 214, 217, 221, 222, 226, 229, 231, 239, 241, 244, 245, 270
Customer Information Control System (CICS), 200, 267
Data, 43, 100, 116, 134, 135, 137, 138, 140, 142, 143, 146, 150ff., 157, 162, 193, 225, 226, 235, 245, 247, 263, 264, 271
Databases, 117, 118, 120, 124, 129, 148, 149, 150, 216, 242, 264
Data flow diagrams, 24, 35, 268, 270
Data item descriptors (DIDs), 32, 267
Defects, xxi, xxii, 18, 19, 21, 22n., 25, 31, 36, 46, 47, 49, 56, 58, 62, 67, 71, 76, 86, 97, 98, 99, 122, 126, 137, 148, 176, 177, 195, 196, 203, 257, 264, 270, 271. See also Failure; Incidents.
  handling, 36, 56
  tracking, 46, 47, 58, 62, 67, 71
DeMarco, Tom, xxivn., 24, 35, 86, 177, 195, 276
Department of Defense (DoD), xv, xvi, 15, 31, 130, 199, 200, 268
Derr, Liz, 181, 276
Design, xvii, xviii, xx, 4, 6, 7, 9, 10, 11, 13, 15, 17, 21, 22, 23, 26, 28, 34, 39ff., 56, 61, 67, 68, 73, 79, 80, 82, 85, 86ff., 93, 94, 95ff., 104, 105ff., 114ff., 119ff., 127, 128, 132ff., 137, 138, 140, 142, 144, 145, 147, 151, 154, 160ff., 176ff., 182, 185, 188, 189, 192, 193, 200, 206, 208, 209, 213, 214, 217, 218, 223, 231, 234, 240, 242, 243, 247, 254, 255, 259, 270, 273
  object-oriented, 121
  specification, 27, 45, 99, 107, 115, 117, 140, 148, 150, 168
Developers, xix, 15, 18, 33, 101, 109, 121, 122, 149, 180, 181, 183, 194, 226, 245, 261, 262, 264, 270
Development, xviii, 6, 7, 18, 19, 22n., 30, 40, 43, 67, 68, 88, 98, 108, 117, 118, 176, 177, 179, 182, 196, 205, 209, 224, 271, 272
  engineers, 19, 36
  managers, 19, 196
  schedule, 209, 216, 220
Development life cycle, 7, 38, 59, 100, 145, 270
Documentation, xviii, xix, xx, xxiv, 7, 10, 15, 17, 18, 21, 25ff., 31, 36, 38, 39, 40, 43, 44, 46, 47-50, 51, 59, 61, 63, 68, 72, 73, 77, 84, 86, 87, 88, 91, 94, 96, 97, 98, 104, 106, 109, 114, 119, 120, 127, 128, 130, 131, 132, 135ff., 141ff., 150ff., 158, 159, 160ff., 175, 176ff., 183, 186, 188, 189, 190, 192, 193, 196, 211, 223, 224, 228, 229, 230, 231, 233, 234, 235, 236, 239, 242, 243, 254, 257, 259, 271, 273
Dustin, Elfriede, 117, 261, 264, 265, 276
Dymond, Ken, 200, 202, 277
Eastman Kodak, xvi, 148, 201
E-mail, 91, 92, 111, 112, 145, 153, 172, 173
End user, 5, 6, 205, 213, 221, 229
Engineering, 7, 15, 17, 27, 31, 32, 37, 38, 40ff., 49, 53, 55, 56, 59ff., 77, 80, 85, 87ff., 96, 99, 100, 101, 103, 107, 109, 112, 115ff., 121, 122, 125, 128, 129, 130, 135ff., 143, 146, 150, 153, 154, 156, 158, 163, 166, 167, 169, 170, 172, 173, 176, 179, 182, 195, 196, 201, 208, 232, 250ff., 262, 263, 269, 270, 271, 273
  development, xviii, 37
  hardware, 15
  software, 15, 23, 29, 40, 45, 48, 55
  software quality, xviii
  test, 15, 16, 18, 19, 24, 25, 33, 34
Estimating, xvi, xvii, xx, xxiv, 3, 15, 19, 117, 174, 185, 186, 191, 194, 209, 210
Extreme Programming (XP), 5, 11, 45, 121, 181-83, 184, 269
  four values for, 181
Failure, 15, 45, 117, 134, 148. See also Defects.
Federal Drug Administration (FDA), 6, 98, 100, 130, 268
Federally funded research and development center (FFRDC), 199, 268
Feedback, 66, 68, 70, 76, 80, 86, 89, 91, 93, 97-98, 102, 103, 105, 107, 109, 111, 113, 115, 118, 121, 122, 124-25, 126, 128, 131, 135, 137, 139, 141, 143, 144, 146, 148ff., 155, 157, 159, 162, 164ff., 169, 170, 172, 173
Formal Testing Process Model, xxi, 4, 21ff., 28ff., 173, 174, 181, 202
  how to use, 17ff.
Freedman, Daniel P., 32, 77, 92, 100, 112, 177, 277
Functional configuration audit (FCA), 31, 268, 271
Gause, Donald C., xvii, 16, 277
Gelperin, David, 35, 174, 277
Graphical user interface (GUI), 24, 41, 99, 107, 116, 128, 149, 150, 177, 180, 262, 263, 264, 268, 270
Halstead, Maurice H., 107, 277
Halstead Complexity, 107
Hardware, xiii, xv, xxiii, 11ff., 31n., 34, 45, 47, 55n., 83, 87, 116, 117, 136, 139, 143, 155, 156, 158, 178, 200, 206, 208, 211, 215, 218, 223, 231, 240, 241, 245, 268, 272
Hetzel, Bill, 6, 144, 174, 277

House, Tip, 183, 261, 276
Humphrey, Watts S., xvi, xxiii, 19, 21, 199, 202, 277
Iannino, Anthony, 279
IBM, 21, 199, 200, 267
IEEE. See Institute of Electrical and Electronics Engineers (IEEE).
Implementation, 4, 129, 175, 183, 206, 209, 210, 215, 216, 223, 231, 238, 240, 272
Incidents, 146ff., 161, 165, 166, 168, 181ff., 186, 188, 191, 192, 205, 211, 226, 234, 243, 257, 263, 271. See also Defects; Failure.
  analysis, 150
  handling system, 15, 34, 36, 75, 83, 85, 103, 234, 262
  report, 155, 193, 211, 226, 234, 263
  review, 150
  severity levels, 149-50, 158
  tracking, 136, 143, 146, 148, 149, 150, 155, 157, 163, 182, 226, 234, 257, 260
Input-Process-Output (IPO), xvi, xxii, xxiii, xxiv, 21-23, 24-26, 27, 28ff., 51ff., 73ff., 97, 132ff., 176, 179, 184, 188ff., 193, 268
  Level 0, 24-25, 188, 189
  Level 1, 25-26, 27, 28ff., 51, 73, 94, 132, 160, 189, 190, 191
  Level 2, 27, 50, 51ff., 73ff., 94ff., 132ff., 160ff., 190, 191, 193
  Level 3, 27, 51ff., 73ff., 94ff., 132ff., 160ff., 190, 191
Inspections, xxii, 7, 13, 14, 21, 31, 32, 37, 38, 42, 59, 63, 68, 77, 85, 92, 100, 101, 112, 122, 130, 172, 177, 186, 210, 217, 218, 219. See also Review; Walkthroughs.
Institute of Electrical and Electronics Engineers (IEEE), xiv, xv, xvi, xxiii, xxv, 7, 16, 18, 19, 30, 31, 34, 35, 36, 39ff., 54, 55, 58, 59, 63, 70, 71, 74ff., 85, 87, 98ff., 106, 107, 108, 114, 115, 126, 127, 128, 133, 137, 169, 170, 174, 176, 179, 185, 187, 188, 205, 206, 214, 222, 223, 230, 231, 239, 252, 254, 268ff., 277, 278. See also Standards.
Intelligent character recognition (ICR), 117, 268
International Software Testing Institute (ISTI), 184, 268
Jacobson, Ivar, 4, 180, 269, 278
Jaskiel, Stefan, 186, 187, 261, 276
Jones, T. Capers, 21, 86, 177, 278
Kan, Stephen, 21, 278
Kerth, Norman L., 191, 195, 278
Key process areas (KPAs), 13, 20, 87, 92, 98, 111, 200, 201, 203, 268
KSLOC, 11, 174, 268
Lines of source code, 11, 129, 174. See also Thousand lines of source code (KSLOC).
Lister, Timothy, 86, 177, 195, 276
Management, xvi, xviii, 7, 16, 17, 34, 37, 38, 63, 67, 70, 71, 92, 98, 112, 133, 136, 137, 141, 149, 154, 178, 183, 205, 210, 213, 217, 221, 224, 229, 232, 237, 240, 257, 271
Managers, xvi, xviii, 10, 19, 21, 30, 50, 77, 87, 88-89, 91, 92, 103, 111, 112, 137, 140, 145, 146, 152, 154, 156, 158, 183, 193, 194-95, 202, 226, 231, 271
Marick, Brian, 261, 278
McCabe, Thomas J., 99, 107, 114, 278
McCabe Cyclomatic Complexity, 99, 107, 114
  measure, 114
McCabe's ACT, 41
McDiarmid, Douglas, 276
McMenamin, Stephen M., xxiin., 279
Meeting, 91, 92, 101, 110ff., 152, 153, 154, 182, 184, 185, 191, 257, 258
  agenda, 153
  data pack, 136ff., 152, 153
  minutes, 101, 112, 134, 137, 140, 141, 154, 192
  pass/fail decision, 155, 157
  posttest, 133, 135, 137, 151-55, 157, 179, 181, 183, 184, 186, 192, 257, 258
  pretest, 133, 135, 136, 137-41, 151, 152, 179, 181, 183, 186, 192, 257
  report, 110, 138, 141, 151ff., 191
  review, 111, 140, 153, 182
Meredith, Denis, 86
Metrics, xvii, 87, 145, 158, 211, 213, 226, 228
  quality, 226
Microsoft, 30, 32, 41, 45, 59, 67, 101, 103, 104, 115, 117, 178, 180, 191, 262
Musa, John, 15, 279
Myers, Glenford J., xvii, 279
Myers, Ware, 19, 46, 279
Narayanan, R. Sankara, 136, 144, 264, 279
Network, 36, 105, 116, 144, 270
Okumoto, Kazuhira, 279
Optical character recognition (OCR), 268
Organization process definition, xix, 20
Page-Jones, Meilir, xxiin., 279
Palmer, John F., xxivn., 279
Perry, William E., xv-xvi, 35
Personal computer (PC), 36, 114, 117, 261, 268
Personnel, 33-35, 141, 145, 153, 195, 225, 226, 232, 234
Physical configuration audit (PCA), 268, 271
Process, xv, xvi, 33, 38, 42-43
  definitions, xv
  documenting a, xvi
Process 1.1, Extract Test Information From Program Plans, 28, 32, 51-72, 73, 94, 132, 160, 179, 182, 185, 189, 190, 249-251
Process 1.2, Create Test Plan, 28, 30, 34, 40, 51, 72, 73-93, 94, 95, 96, 99, 103, 111, 132, 160, 180, 182, 185, 189, 190, 252-253
Process 1.3, Create Test Design, Test Cases, Test Software, And Test Procedures, 28, 39, 40, 43, 47, 51, 73, 93, 94-131, 132, 160, 182, 185, 189, 190, 254-256
Process 1.4, Perform Formal Test, 28, 43, 47, 49, 51, 73, 94, 101, 131, 132-159, 160, 183, 189, 190, 192, 257-258

Process 1.5, Update Test Documentation, 28, 47, 51, 73, 94, 132, 137, 159, 160-173, 183, 186, 189, 191, 259
Process flow, 54, 58, 62, 66, 70, 76, 80, 87, 91, 98, 103, 107, 111, 115, 121, 125, 129-30, 135, 144-46, 149ff., 157-58, 163-64, 166, 169-70, 172-73
Process improvement, xix, xx, xxi, xxiii, xxiv, 14, 18, 77, 80, 191, 193, 195, 200, 202, 269
Process model, xx, 11, 16-17, 26, 27, 29, 149, 173, 182, 184, 185, 188-93, 194
  Extreme Programming and, 181-83
  organizational considerations of, 11-15
  scope of, xxiii
  tailoring, xxiii, 5, 174ff., 191, 194
Program life cycle, 59, 68, 78, 89
Program management, 80, 137, 139, 153, 173, 177, 195
Program Management Plan (PMP), xviii, xxi, 29, 30, 51-52, 53ff., 64, 66, 67, 70, 77, 80, 182, 185, 190, 204ff., 221, 249, 250, 268
  outline, 212
  schedule, 59, 77
Program manager, 11, 13, 14, 58, 81, 140, 141, 154, 164, 183, 193ff.
Programming, 7, 10, 34, 119, 193, 225
Program plans, 28ff., 36, 37, 51ff., 85, 87, 94, 160, 179, 185, 189, 190, 191, 249
  extracting information from, 51ff., 185
  review of, 29, 32, 33
  test plan and, 36
Project management, 54, 205, 207, 212, 214, 216, 223, 231, 239, 268
Putnam, Lawrence H., 19, 279
Quality, 64, 67, 196, 216, 224, 226
Quality assurance (QA), xviii, xix, 13, 14, 15, 29ff., 55, 58, 64, 85, 92, 98, 112, 124, 170, 176, 177, 182, 193, 195, 201, 203, 208, 211, 268, 272. See also Software quality assurance (SQA).
  analysts, 76, 98
  department, 112
  experts, 58
  group, 112, 175, 193
  personnel, 141, 195
  plan, 31, 33
  requirements, 29, 62
  staff, 14, 31, 37, 61, 170, 177, 182
Quality Assurance Institute (QAI), xiii, xxiii, 16, 35, 184, 186, 268
Quality Assurance Plan (QAP), 29, 30, 36, 52, 53, 54, 55, 56, 58, 60-64, 66, 182, 185, 190, 249, 268. See also Software Quality Assurance Plan (SQAP).
  outline, 228
Quality function deployment (QFD), 268
Quality Systems International (QSI), 79, 101, 268
Radatz, J.W., 21, 22n.
Rashka, Jeff, 276
Request for proposal (RFP), 183-84, 268
Requirements, xiv, xvii, xviii, xxii, 4, 6, 7, 11ff., 22ff., 29, 30, 33ff., 39, 40, 42, 44, 46, 48, 49, 51, 53, 54, 56, 59, 60, 61, 63ff., 68, 69, 72ff., 78ff., 80, 83, 91, 92, 96, 97, 99ff., 115ff., 120ff., 134, 135, 142, 144ff., 154, 161ff., 176ff., 200, 203, 205, 206, 208, 209, 211, 214, 218, 223ff., 229, 231, 233, 234, 238ff., 247, 249, 252, 260, 261, 270, 271, 272
  ambiguous, 81, 83, 103, 105
  analysis, 4, 6, 9, 11, 15, 19, 46, 74, 76, 78-83, 103, 120, 124, 182, 191, 218, 270
  development of, 16, 80, 120
  documentation, 35, 38, 42, 165
  management, 83, 201
  phase, 54, 116, 121, 125
  problems, 37-38, 46, 80, 148
  review, 7, 9, 15, 16, 17, 76, 80, 81, 82, 85, 125, 145, 155, 179ff., 185, 194, 195, 196, 260, 261
  specification, 7, 27, 33, 37, 45, 82, 103, 104, 117, 135, 140, 148, 150, 166, 168, 238
  validation, 80, 87, 98, 226
  verification, 7, 226
  volatility, 103, 104
Review, 18, 31, 34, 37, 38, 41, 46, 48, 49, 53, 55ff., 76ff., 85, 88, 89, 91ff., 102ff., 109-13, 115, 118, 119, 122, 125, 128, 130, 133, 136ff., 144, 145, 151ff., 161, 163, 164, 166ff., 176ff., 185, 186, 190, 191ff., 203, 209, 210, 217, 219, 225, 227, 228, 233, 234, 237, 250ff., 259, 260, 261
  documentation, 184, 188
  meeting, 91, 92, 100, 138, 139, 172, 176, 178, 182, 185, 227, 253
  peer, xix, 92, 111, 145, 177, 191, 195, 201, 202, 203, 217
Risk, 23, 24, 26, 34, 35, 37, 41, 74, 75, 76, 84, 85, 86, 89, 104, 106, 108, 122, 126, 177, 179, 180, 185, 188, 189, 191, 195, 210, 239, 242, 245, 248, 273
  analysis, 41, 104, 187, 205
  assessment, 217, 242
  determining, 37
  documentation, 108
  management, 31, 37, 59, 86, 177, 179, 195, 210, 217
  mitigation, 41, 86, 108, 177
Rumbaugh, James, 278
Schedule, 12, 29, 30, 34, 41, 43ff., 56, 58, 59, 60, 64ff., 71, 75, 77, 80, 84, 85, 88, 89, 91, 103, 104, 108, 111, 125, 130, 134, 135, 138ff., 140, 145, 146, 149, 151, 154, 163, 176, 180, 186, 191, 194, 195, 199, 200, 205ff., 214ff., 222, 223, 231, 233, 234, 240, 244, 245, 248, 273
Software capability evaluation (SCE), 201, 269
Software components, 64, 114, 216, 220, 222, 229, 230, 233, 245, 272
Software Configuration Management Plan (SCMP), 237, 241, 249, 250, 269. See also Configuration Management Plan (CMP).
  outline, 237
Software development, xvii, xviii, xix, xxii, 3, 16, 17, 31, 34, 37, 42, 45, 46, 67, 74, 76, 80, 85, 88, 89, 91, 98, 99, 103, 107, 109, 111, 115, 121, 125, 128, 133, 135, 136, 140, 143, 144, 145, 150, 153, 154, 158, 172, 177, 178, 205, 206, 209, 217, 218, 233, 260

Software development life cycle (SDLC), xvi, xviii, xix, xx, 3, 4-6, 9, 10, 35, 45, 64, 67, 99, 107, 115, 121, 125, 126, 181, 189, 205, 209, 214, 216, 226, 230, 234, 269
  phases of, 9, 125, 126, 231, 234
Software Development Plan (SDP), xviii, xxi, 29, 30, 52ff., 58, 64-68, 70, 77, 182, 185, 190, 213-20, 230, 242, 249, 250, 269
  outline, 220
  sections of, 64, 66-67
Software engineering, xix, 11, 13, 30, 31, 55, 64, 67, 175, 218, 219, 220, 272
Software Engineering Institute (SEI), xviii, xix, xx, 3, 19, 20, 31, 54n., 83, 92, 175, 199-203, 235, 269
Software Engineering Process Group (SEPG), 14, 176, 269
Software Process Improvement and Capability dEtermination (SPICE), 202, 269
Software quality assurance, 14, 30, 55, 133, 136, 144, 153, 154, 156, 158, 172, 218, 220, 222, 225, 227, 234. See also Quality assurance.
Software Quality Assurance Plan (SQAP), xviii, xxi, 218, 221-28, 241, 242, 249, 250, 269
  outline, 228
Software Quality Engineering (SQE), xxiii, 6, 16, 35, 174, 184, 186-87, 269, 271
Software requirements specification (SRS), 7, 19, 26, 27, 35, 76, 77, 79, 82, 85, 103, 104, 107, 168, 269
Software testing process, xiii, xvi, xviii
  evaluating, 249-59
Source code, 25, 39ff., 45, 48, 97, 122, 134, 135, 230, 234
Spiral model, xiv, 4, 67
Standards, xvi, xx, xxiii, xxv, 18, 26, 30, 31, 34ff., 48, 49, 53ff., 69, 70, 71, 74ff., 83, 85, 87, 96ff., 105ff., 113, 114, 115, 118ff., 133, 137, 155, 156, 157, 161, 162, 167ff., 174, 176, 179, 184, 185, 189, 191, 192, 194, 222, 225, 228, 230, 233, 239, 240, 241, 252, 254, 268ff.
  Standard-610.12, xxv, 7, 34, 39, 40, 99, 205, 206, 214, 222, 230, 239, 271ff., 277
  Standard-829, xiv, xvi, xx, xxiii, 18, 34, 36, 40, 41, 42, 74, 75, 77, 85, 87, 98, 100, 101, 106, 107, 108, 114, 115, 126, 127, 128, 133, 137, 158, 169, 170, 176, 179, 185, 239, 252, 254, 273, 278
Statement of work (SOW), 13, 206, 233, 240, 269
Structured analysis, 24, 121
Structured query language (SQL), 129, 146, 269
Subject-matter expert (SME), 12, 92, 100, 112, 117, 172, 177, 179, 185, 261, 262, 263, 264, 269
System requirements specification (SRS), 74, 87
Systems administration, 46, 85, 88, 125, 129, 139, 140, 143, 150, 179, 181, 261, 262, 263
Systems engineering, xviii, xix, 3, 12, 13, 103, 207
System Test Plan (STP), 238ff., 269
  outline, 247
System under development, 122, 126
System under test, xxv, 87, 99, 149, 153

Test, xv, 4, 22, 41, 61, 64, 103, 209, 223, 231, 233, 234, 239, 240
  acceptance, 218
  alpha, 270
  automation, xxi, 177, 182, 183, 185
  beta, 6, 270
  black-box, 6, 35, 44, 45, 87, 99, 128, 270, 271
  buddy, 270
  cases, xvii, xviii, xxv, 7, 10, 15, 18, 24ff., 34, 39-43, 44, 47ff., 73, 82, 87, 93, 94, 95ff., 105ff., 113ff., 120ff., 132, 135, 137, 139ff., 151, 153, 154, 157, 158, 160ff., 176, 179, 180ff., 189, 190, 192, 193, 242, 243, 247, 255, 259, 263, 273
  component, 30, 87, 129, 139, 190, 271
  data, 43, 100, 116, 134, 135, 137, 138, 140, 142, 143, 146, 150ff., 157, 162, 193, 247, 271
  databases, 100, 105, 129, 144
  design, xvii, xviii, xx, 7, 10, 15, 21, 28, 34, 39ff., 44ff., 67, 68, 73, 82, 86ff., 93, 94, 95ff., 104, 105ff., 114, 115, 119ff., 132, 133, 135, 137, 138, 140, 145, 151, 160ff., 176, 178, 179, 180, 182, 185, 189, 190, 192, 193, 242, 243, 247, 254, 255, 259, 273
  documentation, xviii, xix, xx, xxiv, 7, 10, 15, 17, 18, 21, 25ff., 31, 36, 38, 39, 40, 43, 44, 46, 47-50, 51, 59, 61, 63, 68, 72, 73, 77, 84, 86, 87, 88, 91, 94, 96, 97, 98, 104, 106, 109, 114, 119, 120, 127, 128, 130, 131, 132, 135ff., 141ff., 150ff., 158, 159, 160ff., 176ff., 183, 186, 188, 189, 192, 193, 196, 233, 239, 242, 243, 254, 257, 259, 271, 273
  engineering, 15, 17, 27, 31, 32, 37, 38, 40ff., 49, 53, 55, 56, 59ff., 77, 80, 85, 87ff., 96, 99, 100, 101, 103, 107, 109, 112, 115ff., 121, 122, 125, 128, 129, 130, 135ff., 143, 146, 150, 153, 154, 156, 158, 163, 166, 167, 169, 170, 172, 173, 176, 179, 182, 195, 196, 201, 232, 250ff., 262, 263, 269, 270, 273
  environment, 45, 46, 68, 75, 83ff., 103, 104, 113ff., 119, 120, 123ff., 133, 136ff., 150, 153, 163, 178, 180, 183, 185, 186, 191, 192, 257, 259
  execution, xvii, xxi, xxiii, xxiv, 36, 37, 45ff., 63, 67, 78, 81, 87, 88, 89, 96, 98, 101, 102, 109, 116, 118, 119, 122-23, 126, 128, 130, 131, 133ff., 141ff., 148, 151, 153, 154, 157, 161, 163, 168, 172, 176, 178, 181, 183, 186, 195, 257, 259, 260ff., 273, 274
  experts, 35, 45
  formal, 43-47, 51, 59, 68, 73, 94, 98, 132ff., 160, 164, 186, 188, 189, 190, 193, 257
  functional, 263-64, 265, 266
  gray-box, 99, 128, 270, 271, 274
  harness, 15, 19, 34, 36, 37, 45, 76, 85, 88, 104, 125, 139, 140, 143, 178, 179
  incidents, 136, 188, 243
  information, 28, 51ff., 94, 160, 179, 189, 190, 249
  input data, 40, 43, 47, 243
  integration, xvii, 6, 7, 13, 29, 30, 43, 45, 59, 68, 87, 118, 129, 130, 139, 150, 218, 270, 271
  items, 114, 115, 116, 127, 128, 129, 133, 136, 138ff., 150, 156, 157, 191, 243, 273
  manager, xiii, xviii, 10, 19, 30, 50, 77, 87, 88-89, 91, 92, 103, 111, 112, 137, 140, 145, 152, 154, 156, 158, 183, 193, 194-95, 202
  output, 150, 151, 155, 156
  pass/fail criteria, 97, 108, 134, 135, 154, 180, 185, 191, 192, 244, 247, 257
  performance, 104, 109, 265, 266
  plan, xvii, xx, xxi, xxii, xxiii, xxv, 7, 10, 15, 16, 19, 21, 23, 24, 28, 30, 31, 34ff., 42, 44ff., 54, 55, 58, 64, 66, 67, 68, 70, 72, 73ff., 94, 97, 99, 100, 101, 102ff., 115, 127, 128, 132ff., 160ff., 176, 178, 180, 181, 182, 185ff., 191, 192ff., 219, 240, 241, 243, 252ff., 259, 260, 263, 269, 271, 273
  process, xvii, xviii, 3, 73, 77, 95, 135, 138, 144, 147, 176, 194, 195, 196, 240
  program, xvii, 24, 74, 75, 76, 79, 87, 101, 128, 154, 186, 191, 194, 249, 254, 273
  regression, 31, 44, 46, 47, 48, 87, 108, 130, 135, 136, 149, 157, 166, 169, 172, 177, 178, 261, 272
  report, 155-59, 192
  requirements, 6, 29, 30, 35, 51, 56, 60, 61, 64, 65, 68, 69, 72, 84, 91, 92, 118, 119, 120, 180, 190, 240, 241, 242, 247, 252, 270, 271
  restart/recovery, 118
  resumption criterion, 87
  schedule, 43ff., 103, 134, 135, 138ff., 154, 195, 273
  scripts, 126, 183
  sequence, 118, 261, 262, 265
  smoke, 115, 117, 118, 261, 262
  status, 154, 156, 243
  strategy, 67, 82, 108, 273
  stress, 265
  structural, 6, 271, 274
  suspension criterion, 87, 244, 247
  tasks, 84, 87, 103, 180, 242, 247, 273
  team, 32, 55, 89, 114, 116, 126, 133, 137, 138, 145, 158, 163, 172, 177, 183ff., 194, 195, 196
  technicians, 42, 136, 273
  tools, 43, 134, 135, 141, 142, 144, 183, 189, 191, 243
  tracking, 63
  unit, 6, 13, 27, 29, 30, 43, 45, 59, 68, 86, 87, 88, 118, 122, 129, 133, 136, 138, 139, 150, 179, 193, 218, 270, 271
  user acceptance, 6, 30, 273
  white-box, 6, 87, 193, 270, 271, 274
Testing life cycle (TLC), xix, xx, xxiii, xxiv, 6-11, 16, 38, 41, 59, 67, 108, 166, 169, 189, 269
  components of, 7-8
  modeling, 22, 23, 26, 40, 44, 48, 53, 57, 61, 65, 69, 74, 75, 79, 84, 90
Thousand lines of source code (KSLOC), 11, 174, 268
Tools, xxi, xxiii, xxiv, 23, 26, 35, 37, 39, 40, 41, 43, 46, 70, 71, 74, 75, 84, 96, 97, 101, 115, 116, 119, 123, 124, 134, 135, 141ff., 148, 177, 178, 180, 183, 189, 191, 205, 211, 213, 214, 218, 221, 226, 229, 233ff., 241, 243
  automated, 25, 41, 46, 79, 101, 115, 119, 123, 125, 129, 141, 143, 144, 177, 178, 180, 188, 270
Traceability matrix, 78, 79, 82, 226
Turner, Richard, 275
Unified Software Development Process (USDP), xiv, 4, 269
U.S. Air Force Rome Air Development Center (RADC), 21, 268
Use cases, 36, 180, 182, 186, 263, 264
User, 7, 27, 108, 119, 120, 121, 122, 124, 125, 126, 142, 153, 205, 207, 238, 241, 263, 265
  acceptance testing, 27
  documentation, 120, 122, 124, 126
  instructions, 119, 124
  manuals, 15, 45, 143, 150, 261
V-diagram, xvi, 7, 8
Vendor, 43, 124, 125, 226, 235
Walkthroughs, 13, 14, 38, 42, 59, 77, 92, 100, 101, 112, 122, 172, 177, 186, 209, 210, 217, 219, 254, 255, 256
Waterfall model, 8, 59, 67
Weinberg, Gerald M., xvii, 16, 32, 77, 92, 100, 112, 177, 277
White-box testing, 6, 87, 193, 270, 271, 274
Wiegers, Karl E., xvii, xviii, 16, 32, 77, 92, 100, 112, 177, 279
Work breakdown structure (WBS), xvi, 19, 269
Yourdon, Edward, xvi, xxivn., 35, 187, 279, 280
Yourdon, Inc., xxiv, 24, 270
YOURDON Systems Method (YSM), 24, 26, 270