T-76.5613 Test documentation and Reporting 2006

Juha Itkonen, SoberIT
Helsinki University of Technology
T-76.5613 Software Testing and Quality Assurance
23.10.2006

Transcript of T-76.5613 Test documentation and Reporting 2006

Page 1: T-76.5613 Test documentation and Reporting 2006

Juha Itkonen, SoberIT

HELSINKI UNIVERSITY OF TECHNOLOGY

T-76.5613 Software Testing and Quality Assurance

23.10.2006

Test documentation and Reporting

Page 2: T-76.5613 Test documentation and Reporting 2006


IEEE 829 Standard for Software Test Documentation

Test Planning
- Test plan
  - Project level
  - Phase level

Test Specification
- Test design specification
- Test case specification
- Test procedure specification

Test Reporting
- Transmittal report
- Test log
- Incident report
- Test summary report

The number of test documents needed, and their format, thoroughness, and level of detail, depend on context.

Page 3: T-76.5613 Test documentation and Reporting 2006


IEEE 829 Standard for Software Test Documentation

Test Planning
- Test plan
  - Project level
  - Phase level

Test Specification
- Test design specification
- Test case specification
- Test procedure specification

Test Reporting
- Transmittal report
- Test log
- Incident report
- Test summary report

Page 4: T-76.5613 Test documentation and Reporting 2006


Test plan – a project plan for testing

A document describing the scope, approach, resources, and schedule of intended testing activities. It identifies the test items, the features to be tested, the testing tasks, responsibilities, schedules, and risks. A test plan can be seen either as a product or as a tool.

Test plan as a product: the structure, format, and level of detail are determined not only by what is best for the effectiveness of the testing effort, but also by what the customer or a regulating agency requires.

Test plan as a tool: creating long, impressive, or detailed test planning documents is not the best use of your limited time. A test plan is a valuable tool to the extent that it helps you manage your testing project and achieve your testing goals.

Page 5: T-76.5613 Test documentation and Reporting 2006


Test Plan (IEEE Std 829) 1/5

1 Test plan identifier

2 Introduction
- Product to be tested, objectives, scope of the test plan
- Software items and features to be tested
- References to project authorization, project plan, QA plan, CM plan, relevant policies and standards

3 Test items
- Test items, including version/revision level
  - Items include end-user documentation
  - Bug fixes
- How the items are transmitted to testing
- References to software documentation

Page 6: T-76.5613 Test documentation and Reporting 2006


Test Plan (IEEE Std 829) 2/5

4 Features to be tested
- Identify test design / specification techniques
- Reference requirements or other specifications

5 Features not to be tested
- Deferred features, environment combinations, …
- Reasons for exclusion

6 Approach
- How you are going to test this system
  - Activities, techniques, and tools
  - Detailed enough to estimate
- Completion criteria
  - Specify the degree of comprehensiveness (e.g. coverage) and other criteria (e.g. faults)
- Identify constraints (environment, staff, deadlines)

Page 7: T-76.5613 Test documentation and Reporting 2006


Test Plan (IEEE Std 829) 3/5

7 Item pass/fail criteria
- What constitutes success of the testing
- E.g. coverage, bug count, bug rate, number of executed tests, …
- Is NOT a product release criterion

8 Suspension and resumption criteria
- For all or parts of the testing activities
- Which activities must be repeated on resumption

9 Test deliverables
- Test plan
- Test design specification
- Test case specification
- Test procedure specification
- Test item transmittal report
- Test logs
- Test incident reports
- Test summary reports

Page 8: T-76.5613 Test documentation and Reporting 2006


Test Plan (IEEE Std 829) 4/5

10 Testing tasks
- Including inter-task dependencies and special skills
- Estimates

11 Environment
- Physical, hardware, software, tools
- Mode of usage, security, office space
- Test environment set-up

12 Responsibilities
- To manage, design, prepare, execute, witness, check, resolve issues, provide the environment, and provide the software to test

13 Staffing and training needs

Page 9: T-76.5613 Test documentation and Reporting 2006


Test Plan (IEEE Std 829) 5/5

14 Schedule
- Test milestones in the project schedule
- Item transmittal milestones
- Additional test milestones (environment ready)
- What resources are needed, and when

15 Risks and contingencies
- Testing project risks
- Contingency and mitigation plan for each identified risk

16 Approvals
- Names and when approved

Page 10: T-76.5613 Test documentation and Reporting 2006


Course Test Plan Template for exercise phase 3

1. Introduction

2. Tested items and features

3. Testing Approach

4. Resources

5. Tasks and Schedule

6. Risks and contingencies

7. Approvals

This is the high-level structure for your test plan. The details below each chapter give you an idea of its contents. You should apply information from your course book, the lectures, and the IEEE 829-1998 standard to make as good a test plan as possible.

Remember: content or nothing. Grading is not based on word count.

Page 11: T-76.5613 Test documentation and Reporting 2006


Test plan quality criteria

- Usefulness: Will the test plan effectively serve its intended functions?
- Clarity: Is the test plan self-consistent and sufficiently unambiguous?
- Accuracy: Is the test plan document accurate with respect to any statements of fact?
- Adaptability: Will it tolerate reasonable change and unpredictability in the project?
- Efficiency: Does it make efficient use of available resources?
- Usability: Is the test plan document concise, maintainable, and helpfully organized?
- Compliance: Does the test plan meet externally imposed requirements?
- Foundation: Is the test plan the product of an effective test planning process?
- Feasibility: Is the test plan within the capability of the organization that must use it?

Source: Kaner, Bach, Pettichord. Lessons Learned in Software Testing. 2002

Page 12: T-76.5613 Test documentation and Reporting 2006


IEEE 829 Standard for Software Test Documentation

Test Planning
- Test plan
  - Project level
  - Phase level

Test Specification
- Test design specification
- Test case specification
- Test procedure specification

Test Reporting
- Transmittal report
- Test log
- Incident report
- Test summary report

Page 13: T-76.5613 Test documentation and Reporting 2006


Test Case Specification (IEEE Std 829)

1. Test-case-specification identifier: specifies the unique identifier.
2. Test items: the detailed feature, code module, etc. to be tested; references to product specifications or other design documents.
3. Input specifications: each input required to execute the test case (by value with tolerances, or by name); identifies all appropriate databases, files, terminal messages, etc.; specifies all required relationships between inputs (for example, timing).
4. Output specifications: the result expected from executing the test case; outputs and features (for example, response time) required of the test items; the exact value (with tolerances where appropriate) for each required output or feature.
5. Environmental needs: hardware, software, test tools, facilities, staff, etc.
6. Special procedural requirements: special constraints.
7. Intercase dependencies: lists the identifiers of test cases which must be executed prior to this test case, and the nature of the dependencies.

Page 14: T-76.5613 Test documentation and Reporting 2006


Test design specification

Specifies a set of tests and refines the test plan information:
- Specific methods, tools, and techniques
- Groups together tests that cover a certain set of features
- References to test procedures and test cases
- Pass/fail criteria

Page 15: T-76.5613 Test documentation and Reporting 2006


Test procedure specification

Used if needed, when the test plan and the test design specification are not enough.

Specifies the steps for executing a set of test cases:
- Special requirements for executing the tests
- References to test cases
- Steps and any measurements to be made

Page 16: T-76.5613 Test documentation and Reporting 2006


IEEE 829 Standard for Software Test Documentation

Test Planning
- Test plan
  - Project level
  - Phase level

Test Specification
- Test design specification
- Test case specification
- Test procedure specification

Test Reporting
- Transmittal report
- Test log
- Incident report
- Test summary report

Page 17: T-76.5613 Test documentation and Reporting 2006


Reporting test results

Evaluation of the tested software
- Found defects and issues
- Testers' assessment of the quality
- Risk assessment based on test results and gained knowledge

Comparison of planned vs. actual
- Testing project management report

A post mortem for tests to come
- Weak and omitted areas
- Ideas for new tests
- Risks

Page 18: T-76.5613 Test documentation and Reporting 2006


Test record (log) contains

A chronological record of relevant details about test execution.

Identities and versions (unambiguously) of:
- the software under test (exact build, versions of components, …)
- the test specifications

Testing environment
- Identify the attributes of the environment in which the testing is conducted
- Include the hardware being used (e.g., amount of memory, CPU model, mass storage devices), the system software used, and the resources available

Activity entries
- Date and time for the beginning and end of activities
- Identity of the tester
- Execution description
- Results
- Anomalous events
- Defect IDs
- References to other documents (test case, test design specification, …)
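A test log kept in a structured form makes the activity entries above machine-checkable. Below is a minimal sketch in Python; the field names are illustrative choices, not prescribed verbatim by IEEE 829.

```python
from dataclasses import dataclass, field
from datetime import datetime

# A minimal structured test log entry, following the fields listed above.
@dataclass
class TestLogEntry:
    tester: str
    test_case_id: str
    build: str                      # exact build of the software under test
    environment: str                # e.g. OS, hardware summary
    started: datetime
    finished: datetime
    result: str                     # "pass" / "fail" / "blocked"
    anomalies: list = field(default_factory=list)   # anomalous events observed
    defect_ids: list = field(default_factory=list)  # IDs of reported defects

entry = TestLogEntry(
    tester="J. Tester",
    test_case_id="TC-017",
    build="1.34.52",
    environment="Windows 2000 SP4",
    started=datetime(2006, 10, 23, 9, 0),
    finished=datetime(2006, 10, 23, 9, 25),
    result="fail",
    anomalies=["dialog froze for ~5 s"],
    defect_ids=["BUG-142"],
)
```

A log of such entries can later be filtered by build, tester, or result when writing the test summary report.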

Page 19: T-76.5613 Test documentation and Reporting 2006


Check the results

Follow the plan and mark off progress on the test script
- Note that these records are used to establish that all test activities have been carried out as specified

Document actual outcomes from the test
- Capture any other ideas you have for new test cases
- Compare the actual outcome with the expected outcome
- Log discrepancies accordingly:
  - Software fault
  - Test fault (e.g. expected results wrong)
  - Environment or version fault
  - Test run incorrectly

Log coverage and the other planned metrics for measures specified as test completion criteria

Page 20: T-76.5613 Test documentation and Reporting 2006


Defect reporting

Defect report
- A technical document written to describe the symptoms of a defect in order to:
  - communicate the impact and circumstances of a quality problem
  - prioritize the defect for repair
  - help a programmer locate and fix the underlying fault

Defect reports are the most frequent and visible results of testing work, and an important communication channel from testing to development. They are challenging to write:
- Bearing bad news
- Explaining complicated behaviour
- Communicating to people with a different mindset, using as few words as possible
- The goal is to make people fix their mess instead of creating some new fancy functionality

Page 21: T-76.5613 Test documentation and Reporting 2006


Reporting the found defects

Report the defects immediately
- Don't leave it until the end of the test session

Make sure the defect has not been previously reported

Find out how to reproduce the defect
- Easier to isolate and get fixed

Write specific and clear defect reports
- Spend some time finding out what the actual defect is and under which conditions it occurs
- State the specific expected outcome and the actual outcome

Be non-judgmental in reporting bugs
- Bug reports need to be non-judgmental and non-personal
- Reports should be written against the product, not the person, and state only the facts

Page 22: T-76.5613 Test documentation and Reporting 2006


An effective bug description

Useful bug reports are ones that get bugs fixed!

Minimal
- Just the facts and details necessary
- An exact sequence of steps that shows the problem

Singular
- Only one bug per report – only one report per bug

Obvious and general
- Use steps that are easily performed, show the bug to be as general as possible, and are readily seen by users
- If a programmer or tester has to decipher a bug, they may spend more time cursing the submitter than solving the problem

Reproducible
- Isolate and reproduce what seems like random software behavior
- If an engineer can't see it, or conclusively prove that it exists, the engineer will probably stamp it "WORKSFORME" or "INVALID" and move on to the next bug

Severity
- Show clearly how severe the consequences are if this defect is delivered to operation

Page 23: T-76.5613 Test documentation and Reporting 2006


10 steps to great defect reports

1. Structure: Test in a structured way, so that you understand what you are doing.
2. Reproduce: Clear steps. Three tries.
3. Isolate: Which factors affect the defect, and how.
4. Generalize: Try to find the more general case in which the defect occurs.
5. Compare: Does the same defect exist in other versions, and in other parts of the product?
6. Summarize: Communicate in a single sentence the essence and significance of the defect.
7. Condense: Remove any excess information. Use just the words you need and describe only the necessary steps.
8. Disambiguate: Remove confusing or misleading words – be clear.
9. Neutralize: As the bearer of bad news, express yourself calmly; don't attack the programmer or use unnecessary humour or sarcasm.
10. Review: E.g. an informal check by another tester, or pair testing.

Rex Black, 2004. Critical Testing Processes.

Page 24: T-76.5613 Test documentation and Reporting 2006


Motivate fixing the defect

Make the defect look more serious
- Find a credible scenario that demonstrates the impact of the defect
- E.g., a realistic story that describes how a user can lose data when this defect occurs

Make the defect look more general
- You discovered the defect in some specific case; what is the most general case in which the defect occurs?
- E.g., if you first found that the system cannot cope with '~' and '\', you might be able to generalize the defect into "the system only accepts the characters 'a-z', 'A-Z', and '0-9', and not any special characters, including '-', 'ä', and 'ö'."

Page 25: T-76.5613 Test documentation and Reporting 2006


Defect report

1. Defect-report identifier
2. Title: a short description of the defect
3. Bug description: a detailed description of the defect
   - Date, time, and finder
   - Test item and environment, including version and build numbers
   - Expected results
   - Actual results
   - Repeatability (whether it was repeated; whether it occurs always, occasionally, or just once)
   - Additional information that may help to isolate and correct the cause of the incident
4. Severity of the bug
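One simple way to enforce that every report carries the fields above is a completeness check before the report is filed. A sketch, with hypothetical field names:

```python
# Required fields of a defect report, as listed above (names are assumptions).
REQUIRED_FIELDS = {
    "id", "title", "date", "finder", "test_item", "build",
    "expected_result", "actual_result", "repeatability", "severity",
}

def missing_fields(report: dict) -> set:
    """Return the set of required fields absent from a defect report."""
    return REQUIRED_FIELDS - report.keys()

report = {
    "id": "BUG-142",
    "title": "Adding equal even numbers gives a result too big by one",
    "date": "2006-10-23",
    "finder": "J. Tester",
    "test_item": "Calculator",
    "build": "1.0",
    "expected_result": "2+2 = 4",
    "actual_result": "2+2 = 5",
    "repeatability": "always",
    "severity": "major",
}
assert missing_fields(report) == set()   # complete report: nothing missing
```

A defect tracking tool would typically enforce the same rule through mandatory form fields.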

Page 26: T-76.5613 Test documentation and Reporting 2006


Reporting a bug – an exercise

Suppose that you are running tests on the Windows Calculator and find the following results:

1+1=2
2+2=5
3+3=6
4+4=9
5+5=10
6+6=13
4+6=10
4+5=9

Write a bug title and bug description that effectively describe the problem.

Page 27: T-76.5613 Test documentation and Reporting 2006


Reporting a bug – one solution

Title: Adding a pair of equal even numbers gives a result that is too big (by one)

Description:
- Setup: start version 1.0 of Calculator.
- Repro steps: Try adding pairs of equal even numbers such as 2+2, 4+4, and 10+10. Also try adding pairs of equal odd numbers such as 3+3, 5+5, and 13+13, and pairs of unequal numbers such as 1+2, 4+6, and 15+13.
- Expected result: the correct answer for all pairs: 2+2=4, 4+4=8, …
- Actual result: for pairs of equal even numbers, the answer is one too big: 2+2=5, 4+4=9, 10+10=21, and so on.
- Other info: This wasn't tried exhaustively, but the bug occurred in many instances from 2+2 to 65536+65536. The bug doesn't seem to occur with odd numbers or unequal pairs.
- Environment: Windows 2000, 5.00.2195, Service Pack 4
- Reporter: Jack Debugger
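The reported pattern can be reproduced with a hypothetical faulty implementation; the assertions below restate the repro steps of the sample report as executable checks.

```python
# A hypothetical faulty add that reproduces the observed pattern:
# sums of equal even numbers come out one too big, everything else is correct.
def buggy_add(a: int, b: int) -> int:
    result = a + b
    if a == b and a % 2 == 0:   # the injected fault
        result += 1
    return result

# The repro steps from the report, as executable checks:
assert buggy_add(1, 1) == 2       # equal odd pair: correct
assert buggy_add(4, 5) == 9       # unequal pair: correct
assert buggy_add(2, 2) == 5       # equal even pair: one too big
assert buggy_add(4, 4) == 9
assert buggy_add(10, 10) == 21
```

Note how the generalized title ("a pair of equal even numbers") maps directly onto the fault's trigger condition, which is exactly what helps the programmer locate it.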

Page 28: T-76.5613 Test documentation and Reporting 2006


Test Summary Report (IEEE Std 829) 1/2

1. Test-summary-report identifier

2. Summary
- summarizes the evaluation of the test items
- identifies the items tested (including their version/revision level)
- indicates the environments in which the testing took place
- supplies references to the documentation of the testing process

3. Variances
- indicates any variances of the actual testing process from the test plan or test procedures
- specifies the reason for each variance

4. Comprehensiveness assessment
- evaluates the comprehensiveness of the actual testing process against the criteria specified in the test plan
- identifies features or feature combinations which were not sufficiently tested, and explains the reasons for the omission

Page 29: T-76.5613 Test documentation and Reporting 2006


Test Summary Report (IEEE Std 829) 2/2

5. Summary of results
- summarizes the success of testing (such as coverage, numbers of defects, severities, etc.)
- identifies all resolved and unresolved incidents

6. Evaluation
- provides an overall evaluation of each test item, including its limitations
- based upon the test results and the item-level pass/fail criteria

7. Summary of activities
- the major testing activities and events
- resource consumption (total staffing level, total person-hours, total machine time, total elapsed time used for each of the major testing activities, …)

8. Approvals
- specifies the persons who must approve this report (and the whole testing phase)

Page 30: T-76.5613 Test documentation and Reporting 2006


Meetings, reports and project control issues

The test manager should collect and track the metrics regularly (weekly) and direct the testing group's efforts through status meetings.

Test results must be reported regularly to the whole project group (or development team).
- Depending on the software development model used, the required pace of feedback (test results) varies from one day to over a month.

The mission of a testing team is to produce, for the rest of the organisation, relevant information about the quality status of the system, promptly and in a useful form.

Summary reports are provided to upper management at least at every milestone.

Page 31: T-76.5613 Test documentation and Reporting 2006


Status reports to management

Reports should be brief and contain (J. Rakos):
- Activities and accomplishments during the reporting period
- Problems encountered since the last meeting/report
- Problems solved
- Outstanding problems
- Current state versus plan
- Expenses versus budget
- Plans for the next time period

What is the most important information for upper management from a testing project?

Page 32: T-76.5613 Test documentation and Reporting 2006


The most important information (my answer)

1. Evaluation of the quality and status of the software development (project)
- Brief
- Easy and quick to understand
- Clearly brings forth the relevant information
- Risks

2. Problems that require management actions

3. Status of testing versus plans
- Accomplishments, coverage, defect counts and rates
- Expenses
- Required changes to plans

Page 33: T-76.5613 Test documentation and Reporting 2006


Testing Dashboard

6.6.2003, Build 1.34.52

Functional area    Activity   Coverage   Quality / Comments
File management    low        2          ☺
Main hierarchy     high       2          Some weird behavior, still testing
Formatting         high       2          Some critical bugs found
Drag and drop      pause      1          Doesn't work with other applications
Imports            ready      3          ☺
Exports            blocked    0          Can't test: no specs for export formats
Overall GUI        blocked    3          A lot of small bugs
Editing area       pause      2          ☺ Looks good
Clipboard          –          0          Not delivered
Property tables    –          0          Not implemented
Preferences        high       1          Nothing serious yet…
Help               pause      2          Mainly spelling and grammar issues

Quality legend: ☺ Perfect / Hmm / Aargghhh!
Coverage legend: 3 It's tested / 2 Features checked / 1 We tried it (once) / 0 Nothing

Kaner et al. 2002. Lessons Learned in Software testing.

Page 34: T-76.5613 Test documentation and Reporting 2006


Risk-based reporting

[Figure: a chart of residual risks against progress through the planned testing. The x-axis runs from Start, through Today, to the Planned End. All identified risks are open at the beginning; the curve shows the residual risks of releasing today.]

Gerrard, P. & Thompson, N. 2002. Risk-based E-business Testing.

Page 35: T-76.5613 Test documentation and Reporting 2006


Risk-based testing

Risk-based test case design techniques
- The goal is to analyse product risks and use that information to design good tests (test cases)
- How could this system fail?
  - Error guessing
  - Failure models
  - Experience

Risk-based test management
- Estimating risks for each function or feature
- Using risk analysis to prioritise testing
  - Choosing what to test first
  - Choosing what to test most

Page 36: T-76.5613 Test documentation and Reporting 2006


Qualitative vs. quantitative

Qualitative:
- Technical incompatibility
  - Very likely
  - Could affect many users
  - System facilities not fully available or usable
- Funding withdrawn
  - Unlikely
  - Severe impact

Quantitative:
- Technical incompatibility
  - 70% likelihood
  - May affect 80% of users
  - 7 facilities could be unusable, 3 difficult to use
- Funding withdrawn
  - 5% probability
  - 95% chance of project cancellation (5% find other sponsors)

Page 37: T-76.5613 Test documentation and Reporting 2006


Qualitative risk analysis

The nine combinations of impact and likelihood form a grid:

Impact \ Likelihood   unlikely          mid-likely           likely
Severe impact         severe, unlikely  severe, mid-likely   Nightmares
Some impact           some, unlikely    some, mid-likely     some, likely
Low impact            Insignificant     low, mid-likely      Nuisances

The most benefit comes from addressing the risks between the extremes.

Page 38: T-76.5613 Test documentation and Reporting 2006


Example – Statistical Risk Analysis Matrix

Feature                Cost (Dev / Cust / Avrg.)   New Func. (w=5)   Design Qual. (w=5)   Size (w=1)   Complexity (w=3)   Weigh. Sum   Risk Exposure
Interest Calculation   3 / 3 / 3                   2                 3                    3            3                  37           111
Close Account          1 / 3 / 2                   2                 2                    2            3                  31           62
Create Account         2 / 1 / 1.5                 3                 3                    2            3                  41           61.5

Cost * Probability = Risk Exposure (here: average cost * weighted sum of the factor scores)

Modified slide, originally from Ståle Amland.

The idea is to get the features into priority order – somehow.
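The arithmetic of the matrix can be sketched in a few lines. The weights and scores below are the example values from the matrix; the helper names are ours.

```python
# Factor weights from the example matrix.
WEIGHTS = {"new_func": 5, "design_qual": 5, "size": 1, "complexity": 3}

def weighted_sum(scores: dict) -> int:
    """Probability indicator: sum of weight * score over the risk factors."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

def risk_exposure(avg_cost: float, scores: dict) -> float:
    """Cost * probability, as on the slide."""
    return avg_cost * weighted_sum(scores)

features = {
    "Interest Calculation": (3.0, {"new_func": 2, "design_qual": 3, "size": 3, "complexity": 3}),
    "Close Account":        (2.0, {"new_func": 2, "design_qual": 2, "size": 2, "complexity": 3}),
    "Create Account":       (1.5, {"new_func": 3, "design_qual": 3, "size": 2, "complexity": 3}),
}

# Priority order: highest exposure first (111, 62, 61.5).
ranked = sorted(features, key=lambda f: risk_exposure(*features[f]), reverse=True)
```

Note how Close Account and Create Account end up nearly tied: a high weighted sum can be offset by a low cost, which is exactly why the exposure product, not either factor alone, drives the priority order.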

Page 39: T-76.5613 Test documentation and Reporting 2006


Collections
- an empty collection
- exactly one element (this is somewhat low-yield)
- more than one element
- the maximum number of elements (if that's not reasonable, try it at least once with more than just a few elements)
- duplicate elements

Searching collections
- match not found (if searching a limited subset of a collection, arrange for there to be a match outside that subset)
- one match (the best possible place to put it is probably just before the end bounds, if such a thing makes sense)
- more than one match

Finding subsets (filtering out elements)
- filtering out no elements (subset is the same as the original)
- filtering out one element
- filtering out all but one element
- filtering out all elements (subset is empty)

Special collection elements
- the collection contains itself (redundant with the next one, but the first one to try)
- indirect containment: the collection contains a collection that contains the original collection

Pairs of collections
- both collections empty
- first collection has one element, second has none
- first collection has no elements, second has one
- both collections have more than one element

Test catalogs can be helpful
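A catalog like the one above turns directly into test data. As a sketch, here is the "Collections" group applied to a hypothetical function under test (a simple duplicate remover, chosen only for illustration):

```python
# Hypothetical function under test: remove duplicates, preserving order.
def dedupe(items):
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

# One input per catalog entry: empty, single element, many, duplicates.
catalog_cases = [
    ([], []),                  # an empty collection
    ([7], [7]),                # exactly one element (low yield)
    ([1, 2, 3], [1, 2, 3]),    # more than one element
    ([2, 2, 2, 1], [2, 1]),    # duplicate elements
]
for given, expected in catalog_cases:
    assert dedupe(given) == expected
```

The value of the catalog is that the same input list can be reused for any function that consumes a collection; only the expected outputs change.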

Page 40: T-76.5613 Test documentation and Reporting 2006


Containers
- Appending to a container's contents
  - container initially empty
  - container initially not empty
  - adding just enough elements to fill it
  - a full container: adding one element
  - a partially full container: adding one too many elements
  - adding zero new elements
- Overwriting a container's contents
  - new contents have one more element than will fit
  - new contents just fit
  - zero elements used (is the container correctly emptied?)
  - some elements added, but fewer than in the original (are the old contents correctly cleared?)
- Deleting elements
  - the container has one element (low yield)
  - the container is already empty

Files
- Ability to operate on the file
  - the file exists
    - it's readable / it's not readable
    - it's writeable / it's not writeable
  - the file does not exist
    - it doesn't exist, but it can be created
    - it doesn't exist, and it can't be created
- File types
  - an ordinary file
  - a directory/folder
  - an alias or symbolic link (note that both exist on Mac OS X)
  - a special file (Unix)
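A few of the file-catalog cases above can be probed with the standard library, as sketched below. This is a quick existence/permission probe; real code should still handle errors raised by the actual open() call.

```python
import os
import tempfile

# Create an ordinary file that exists.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name

assert os.path.exists(path)          # the file exists
assert os.access(path, os.R_OK)      # it's readable
assert os.access(path, os.W_OK)      # it's writeable

os.remove(path)
assert not os.path.exists(path)      # the file does not exist any more
```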

Page 41: T-76.5613 Test documentation and Reporting 2006


Names
- whatever the name names does not exist
- two different names name the same thing

Pathnames (POSIX / Mac OS X)
- empty (for example, in a Unix shell, give it the argument "", which is equivalent to ".")
- absolute, for example /tmp/foo
- relative, for example tmp/foo
- If the program is likely to take the pathname apart (to use one of its parts, or to build a new pathname):
  - containing the component "..", as in ../../dir/X
  - slash at the end: foo/
  - no slash: foo
  - slash in the middle: foo/bar
  - more than one slash: foo/bar/baz
  - duplicate slashes: foo//bar (should be equivalent to foo/bar)
- Pathname as an argument to a command-line command: the pathname begins with a dash (how do you get the command to work on -file?)
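The pathname cases above can be exercised against Python's posixpath module to see how one typical library normalizes the tricky forms. This is a sketch of the idea, not a full test suite for a real program.

```python
import posixpath

assert posixpath.normpath("") == "."               # empty pathname behaves like "."
assert posixpath.normpath("foo//bar") == "foo/bar" # duplicate slashes collapse
assert posixpath.normpath("foo/") == "foo"         # trailing slash is dropped
assert posixpath.isabs("/tmp/foo")                 # absolute pathname
assert not posixpath.isabs("tmp/foo")              # relative pathname
```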

Percentage
- If the percentage is calculated from some count, try to make the count be zero.
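The zero-count case is exactly the one that crashes a naive implementation. A sketch with a hypothetical helper:

```python
# Without the guard, percentage(0, 0) would raise ZeroDivisionError.
def percentage(part: int, count: int) -> float:
    if count == 0:
        return 0.0          # defined behaviour for the empty case
    return 100.0 * part / count

assert percentage(3, 4) == 75.0
assert percentage(0, 0) == 0.0   # the catalog's "make the count be zero" case
```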

Page 42: T-76.5613 Test documentation and Reporting 2006


Textual input / Quoted

Unpaired quotes
- quote mark as the first character in the text, for example \abcd
- quote mark somewhere in the middle of the text, for example ab\cd
- quote mark as the last character (quoting nothing), for example abcd\

Paired quotes
- Like double quotes (") around strings, or /* */ around Java comments. I'll use /* and */ for start-of-quote and end-of-quote.
  - quote everything – no unquoted text
  - a quote in the middle, with "real" text before and after it, for example ab/*de*/fg
  - no closing quote mark, for example ab/*de
  - opening quote as the last character in the text, for example /*
  - nested quotes, for example /*/*hi*/*/
  - nothing inside the quote (low yield), for example /**/
  - open quote without a close quote, for example /*no end
  - close quote without an open quote, for example no beginning */

Combinations
- single-quoted double-quote mark, for example \"some text without closing quote
- double quote where the close-quote immediately follows a single-quote mark, for example \/*Not quoted.

Like Unix's use of \ to quote many special characters, or comments that apply from a comment character like # or // to the end of the line.

Page 43: T-76.5613 Test documentation and Reporting 2006


Textual input
- Nothing
- Empty (clear the default)
- 0
- LB-1 (LB = lower boundary)
- LB
- UB (UB = upper boundary)
- UB+1
- Far below LB
- Far above UB
- UB number of characters
- UB+1 number of characters
- Far beyond UB characters
- Negative
- Non-digit ('/', ASCII 47)
- Non-digit (':', ASCII 58)
- Upper-ASCII (128–254) characters
- ASCII 255
- Wrong data type
- Expressions
- Leading zeros
- Leading spaces
- Non-printing characters
- O/S file name
- Upper ASCII
- Upper case
- Lower case
- Modifiers (Ctrl, Alt, etc.)
- Function keys
- Edited with backspace and delete
- Input while processing
- Language-reserved characters
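The numeric boundary values in the catalog (LB-1, LB, UB, UB+1) can be generated mechanically. A sketch against a hypothetical validator for a field that accepts values in [LB, UB]:

```python
# Generate the boundary values named in the catalog for a range [lb, ub].
def boundary_values(lb: int, ub: int):
    return [lb - 1, lb, ub, ub + 1]

# Hypothetical validator for an input field accepting [lb, ub].
def accepts(value: int, lb: int = 0, ub: int = 100) -> bool:
    return lb <= value <= ub

# The two values just outside the range should be rejected,
# the two values on the boundary should be accepted.
results = {v: accepts(v) for v in boundary_values(0, 100)}
assert results == {-1: False, 0: True, 100: True, 101: False}
```

The "far below LB" and "far above UB" cases can be added by multiplying the bounds, but the four values above are the ones most likely to expose off-by-one errors.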