Testing Materials for Basic


Description

This document helps to understand the basic concepts of software testing.

Transcript of Testing Materials for Basic

Page 1: Testing Materials for Basic

SOFTWARE TESTING

Test Engineer Responsibility

Understanding requirements & functional specifications. Identifying test scenarios. Writing & executing test cases. Logging test results (test cases pass or fail). Defect reporting, tracking & fixing defects.

Test Manager Responsibility

Creating Test Plan Review

Quality

Quality is fitness for use / conformance to agreed functional & non-functional requirements. Customers believe that the product meets their needs and expectations.

Continuous Improvement for Quality (PDCA)

Plan: Define the goal and the plan for achieving that goal. Do/Execute: Execute according to the strategy decided during the Plan stage. Check: Check/test to ensure that we are moving according to plan and getting the desired results. Act: If any issues are found during the Check cycle, take appropriate action and revise the plan again.

Page 2: Testing Materials for Basic

Customers View Of Quality (People/Stakeholders)

Delivering the right product. Satisfying customers’ needs. Meeting customer expectations. Treating customers with integrity, courtesy & respect.

Suppliers View Of Quality (Organization)

Doing the right things (no waste, rework). Doing it the right way (standards). Doing it right the first time. Doing it on time.

Page 3: Testing Materials for Basic

User gap (customer) – Failure seen by the customer, in terms of the difference between user needs and product specifications.

Producer gap (developer) – It is due to process failure at the developer's site; improvement plans need to be taken.

Financial Aspects of quality

Sales price = Cost of manufacturing + cost of quality + profit

Cost of manufacturing - Cost required for developing the right product the first time.

Cost of quality - Improving or maintaining the quality of a product. Costs incurred include the cost of prevention (following processes, standards, guidelines), the cost of appraisal (cost of reviews and testing) and the cost of failure (finding defects at any stage of development).
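The pricing relationship above can be illustrated with a small sketch; all figures below are invented for illustration only:

```python
# Hypothetical figures illustrating:
# Sales price = Cost of manufacturing + Cost of quality + Profit
cost_of_manufacturing = 100_000  # developing the right product the first time
cost_of_quality = 20_000         # prevention + appraisal + failure costs
profit = 30_000

sales_price = cost_of_manufacturing + cost_of_quality + profit
print(sales_price)  # 150000
```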

Benchmarking

Software development models (Types of SDLC Models)

Waterfall Model (Classic Life Cycle Model / Linear Sequential Model). Spiral Model. Prototype Model. Rapid Application Development Model (RAD). Iterative and Incremental Development Model. Agile Model.

Waterfall Model (Classic Life Cycle Model / Linear Sequential Model)

1) Systematic and sequential approach (all the activities are carried out in a sequential manner). 2) The output of each phase is given as input to the next phase. 3) Suitable for projects where requirements are clearly defined and of small or medium duration.

Spiral Model

Spiral development process assumes that customer requirements are obtained in multiple iterations and development also works in multiple iterations.

First some functionality is added, then the product is created and released to the customer. After getting the benefit of the first iteration of implementation, the customer may add another chunk of requirements to the existing system.

In the spiral model, software is developed in a series of incremental releases. Each spiral results in a deliverable product.

The customer can start using the system after every spiral. Each spiral consists of a waterfall model.

Prototype model

Identify – Identify the basic requirements. Initial prototype – A prototype is developed based on the initial understanding of the customer requirements; the initial prototype consists of the user interfaces only. Review – The customers, including end-users, examine the prototype and provide feedback on additions or changes. Revise and enhance – Using the feedback, both the specifications and the prototype can be improved.

“A visible working prototype helps the customer to define the requirements.”

Advantages

It can be used when the customer is not sure what they want. It is a faster way of finalizing the requirements.

Page 4: Testing Materials for Basic

Rapid Application development Model (RAD)

RAD enables creation of fully functional software within a very short time. If the requirements are well defined and understood, and the project scope is constrained, the RAD process enables a development team to create a fully functional system within a very short time period.

Iterative and Incremental development Model

Iterative models assume that changes can come from any phase back to any previous phase. Changes may have a cascading effect, where one change initiates a chain reaction of changes.

Incremental development models are used for developing huge systems. These systems are made up of several subsystems, so the incremental model is considered a collection of subsystems. It gives flexibility to customers: one subsystem may be created and the customer may start using it; the customer can learn lessons and apply them while the second part of the system is developed. In incremental development, the system is developed in stages, with each stage consisting of requirements, design, development and test phases; in each stage new functionality is added.

Agile Model

Agile gives the user complete freedom to add requirements at any stage of development. Agile gives complete adaptability to the user environment and continuous integration of the product. In the Agile model the project is divided into various sprints, and each sprint contains high-priority requirements. The time period for a sprint is typically 2-4 weeks. In an Agile model, daily scrum meetings are held with the team to share status and potential issues. Each sprint is released to customers. Used for critical applications.

Guidelines – a suggested way of doing things.

Standards – a mandatory way of doing things.

Audit – independent assessment of products.

Process Related To Software Quality

Vision - What the organization wishes to achieve in a given time.

Mission - What is expected to be achieved toward the overall vision.

Policy - A policy statement talks about the way of doing business, defined by senior management.

Goal – Goals define the milestones to be achieved.

Objective - The expectation from each mission; can be used to measure success/failure.

Strategy - Defines the way of achieving a particular mission.

Test Strategy - Part of the test plan, describing how testing is carried out for the project and what testing types need to be performed on the application.

Values – Values can be defined as principles, e.g., treating customers with courtesy.

Big Bang Approach (System Testing / Final Testing)

Page 5: Testing Materials for Basic

This testing is the last part of software development, ensuring that the requirements defined and documented in the requirement specifications and design specifications are met successfully.

Software Applications Are Basically Categorized Into Two Types: Projects & Products

Projects - If a software application is developed for a specific customer's requirements, it is called a project.

Products - If a software application is developed for multiple customers' requirements, it is called a product.

Errors - Any incorrect human action that produces a problem in the system is called an error.

Defect - Deviation between the expected behavior and the actual behavior of the system is called a defect. (Customer dissatisfaction)

Failure - The deviation identified by end-user while using a system is called a failure.

Mistake - an issue identified while reviewing own documents.

Testing - It is a process of verifying that we are building the product right and validating that we have built the right product.

Defined as execution of a work product with the intent to find defects; it also detects the differences between expected results and actual results.

Software testing = Verification + Validation

Verification - It is the process of verifying that we are building the product right. Known as static testing. It involves reviews, walkthroughs, inspections and audits.

Validation - It is the process of validating that we have built the right product. Also called dynamic testing. It involves system testing, user interface testing, stress testing.

Positive Testing (traditional approach) - Testing conducted on the application with a positive approach, to determine what the system is supposed to do, is called positive testing. It is easy to write test cases.

Page 6: Testing Materials for Basic

Goal: show the system working. Success: the system works.

Negative Testing (better approach) - Testing conducted on the application with a negative approach, to determine what the system is not supposed to do, is called negative testing. It is difficult to write test cases.

Goal: find faults. Success: finding system faults.

Why does software have defects?

Incorrect requirements. Wrong design. Poor coding. Complex business logic & technology. Work pressure. Frequently changing requirements.

Most Common Defects

Incorrect functionality. Incorrect data edits. Poor performance. Poor security. Incompatibility. Poor user interface. Poor usability.

Software Testing Techniques:

Static Testing. White Box Testing. Black Box Testing. Grey Box Testing.

White box Testing (Glass box, Structural, Clear box testing)

Testing conducted on the source code by developers, to check whether the source code works as expected, is called white box testing.

Need of white box testing

As the source code is visible, finding and rectifying problems is easy for developers. The defects identified in white box testing are very economical to resolve. White box testing helps reduce defects as early as possible. It ensures 100% code coverage.

Black box Testing (Dynamic Testing)

Testing conducted on the application by test engineers or by domain experts, to check whether the application works according to customer requirements.

Levels of Dynamic Testing - Component/Unit, Integration, System, Acceptance.

Need of Black Box Testing

White box testing is conducted by developers with a technical perspective, whereas black box testing is conducted by test engineers with an end-user perspective.

Programmers conduct white box testing with a positive perception, whereas testers conduct black box testing with a negative perception, where there is a greater chance of finding more defects.

White box testing does not cover non-functional areas. As non-functional requirements are also very important for a production system, those are covered in black box testing.

Page 7: Testing Materials for Basic

The objective of white box testing is 100% code coverage, whereas the objective of black box testing is 100% customer business requirement coverage.

Black Box Testing = System Testing + User Acceptance Testing, which is called Requirement-Based Testing (or) Specification-Based Testing.

Gray box Testing - It is a combination of black and white box testing. It combines verification techniques with validation techniques.

Successful / Good Tester - The tester who finds more defects is considered a good tester.

Successful / Good Test case – The test case which finds more defects before the product goes to the customer.

Test scenario – Addresses the testing needs and identifies the areas to be tested.

Work bench – It comprises procedures defined for doing the work and procedures defined to check the outcome.

Test policy – Defined by senior management, covering all aspects of testing. It decides the framework of testing and the overall mission of achieving customer satisfaction.

Test Strategy / Approach – Defines what steps are required to perform effective testing: how the test environment will be created, and what testing tools will be used for testing, defect capturing and defect reporting.

Test plan (by test manager) – The test plan should contain test objectives and the test methods applied for defining test scenarios, test cases and test data. A test plan tries to answer six questions: when, what, where, why, which & how.

Test Objective – Measures the effectiveness and efficiency of the testing process and defines what the testing will target to achieve. Some objectives are code coverage and requirement coverage.

Risk

Risk – Potential loss to the user/organization of the software when a problematic situation (unplanned event) occurs.

Product Risk – Potential failure area in the software systems.

Reasons of Risk Facing

Improper use of technology. Repetition of errors again and again. Inability to translate user requirements into technical requirements. Incorrect data entry. Ineffective security design.

Measurement of Risk

Probability of risk – A number ranging from 0% to 100% indicating the probability of the risk happening.

Impact of risk – The impact of the risk must be measured in terms of loss to the organization.

High Risk - if probability and impact both are high.

Medium Risk - if probability and impact both are Medium.

Low Risk - if probability and impact both are low.

Severity - Importance of a defect from a functional point of view, i.e., how critical the defect is with respect to the application.

Priority - It indicates the importance or urgency of fixing a defect.

Risk Formula (RPN / RIN) = Probability of Risk Occurring × Impact or Loss × Inability of Detection

RPN – Risk Prioritization Number. RIN – Risk Identification Number.
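A minimal sketch of the risk formula and the high/medium/low classification above; the 1-10 rating scale and the numeric thresholds are assumptions for illustration, not part of the source text:

```python
def rpn(probability, impact, inability_to_detect):
    """Risk Prioritization Number = probability x impact x inability of detection.
    Each factor is assumed to be rated on a 1-10 scale."""
    return probability * impact * inability_to_detect

def risk_level(probability, impact):
    """High when both factors are high, low when both are low, else medium.
    The thresholds (>= 7, <= 3) are illustrative."""
    if probability >= 7 and impact >= 7:
        return "High"
    if probability <= 3 and impact <= 3:
        return "Low"
    return "Medium"

print(rpn(8, 9, 5))      # 360
print(risk_level(8, 9))  # High
print(risk_level(2, 3))  # Low
```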

Page 8: Testing Materials for Basic

Testing Types

Early Testing - Conducting testing as soon as possible in the development life cycle, to find defects at early stages, is called early testing. Early testing helps reduce the cost of fixing defects.

Mutation Testing - Used to check the capability of the test program and test cases to find defects.

Risk Based Testing – Identifying the operations most likely to cause failure, and testing those functionalities on a priority basis.

Skills Needed for Tester

Knowledge about testing concepts & levels. Understanding of verification/validation. Writing efficient test cases. Knowledge for choosing tools. Knowledge about testing standards. Defining and executing the test plan.

Methods of verification

Self Review – A self check before handing over the work product (not considered an official review). Peer Review – The author and a peer (tester/developer) are involved (informal review). Walkthrough – A larger team is involved along with the author (semi-formal review). Inspection – External people [subject experts] are involved (formal review). Audits – Conducted by auditors [who may not be subject experts] (formal review).

Coverage in verification

Statement Coverage – May concentrate on the important parts of the code (lines). Path Coverage – The sequence of control flow from entry to exit in a given piece of code. Decision Coverage – Covers each outcome of a decision (e.g. for and do-while conditions, often derived using ECP and BVA).
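The difference between statement and decision coverage can be seen on a toy unit; the function below is an assumed example, not from the text:

```python
def absolute(n):
    """Toy unit with a single decision."""
    if n < 0:   # the only decision in this unit
        n = -n
    return n

# Statement coverage: the single input -5 already executes every statement.
assert absolute(-5) == 5

# Decision coverage: the decision must also evaluate to False,
# so a second input such as 3 is required.
assert absolute(3) == 3
```

Here full decision coverage also gives full path coverage, since this unit has only two paths.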

Coverage in Validation

Requirement Coverage – Passing / Failure of such test cases can define whether the requirements have been achieved or Not.

Functionality Coverage – Requirements are expressed in terms of functionality required for the application to work successfully.

Feature Coverage - Features are group of Functionalities.

Acceptance Testing

Acceptance testing is done by the user to understand whether the software satisfies the requirements or not (whether it is fit for use).

Three Levels of Acceptance

Alpha Testing:  Testing a software product or system conducted at the developer's site. Usually it is performed by the end user.

Beta Testing:  Testing a software product or system conducted at the Customer’s site. Usually it is performed by the end user.

Gamma Testing (when it is ready for release) - Testing done by the product development organization to understand customer reaction to an enhanced/new product. The application is given to a few people to use in the production environment, and feedback is obtained from them.

V and VV model

Page 9: Testing Materials for Basic

V Model – Validation Model or Test Model

Requirement phase – system and acceptance testing. Design phase – interface testing. Program level – integration testing. Coding level – unit testing (validating individual units).

VV Model – Verification and Validation Model

The VV model talks about the verification and validation activities associated with each phase of the SDLC.

Requirement verification and Requirement validation Design verification and Design validation Coding verification and Coding validation

DEFECT

Defect

Deviation between the expected behavior and the actual behavior of the system is called a defect. It does not meet customer requirements. (Customer dissatisfaction & not fit for use)

Root cause of defect

Requirements not clearly defined. Wrong design. Lack of training in product development.

Fishbone diagram – a technique for finding the root cause of a defect.

Defect Report Template(Attributes)

Page 10: Testing Materials for Basic

Defect Id. Defect Name. Project Name. Module Name. Phase Introduced. Phase Found. Defect Type (functional / security / user interface defects). Severity. Priority.

Page 11: Testing Materials for Basic

Defect Life Cycle

New → Open → Fixed → Fix Verified → Closed
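The life cycle above can be sketched as a small state machine; the transition set is an assumption based on the listed states (real defect trackers usually add states such as Reopened or Rejected):

```python
# Allowed transitions between the defect states listed above.
TRANSITIONS = {
    "New": {"Open"},
    "Open": {"Fixed"},
    "Fixed": {"Fix Verified"},
    "Fix Verified": {"Closed"},
    "Closed": set(),
}

def advance(state, next_state):
    """Move a defect to its next state, rejecting illegal jumps."""
    if next_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition: {state} -> {next_state}")
    return next_state

state = "New"
for nxt in ("Open", "Fixed", "Fix Verified", "Closed"):
    state = advance(state, nxt)
print(state)  # Closed
```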

Defect fixing (Activities)

Identify risks. Estimate the probability and impact. Minimize the impact by corrections.

SOFTWARE CONFIGURATION MANAGEMENT

Software Configuration Management (SCM)

The process of identifying, organizing and controlling changes to the software during the development and maintenance phases. A methodology to control and manage a software development project.

Configurable items (Artifacts) – Items created during the life cycle that undergo change; one needs to control and track the changes. (Requirement specifications, design documents, test cases)

Non-configurable items (Artifacts) – Items that may or may not change during the life cycle; there is no need to control or track the changes. (Test results, minutes of meetings)

SCM Needs:

Multiple people have to work on software that is changing. A project delivers several releases (builds). Software must run on different machines and operating systems.

Sample list of software configuration items:

Management plans (project plan, test plan, etc.). Specifications (requirements, design, test cases, etc.). Customer documentation (implementation manuals, user manuals, online help files). Source code (Java, .Net, PHP, VB, etc.). Executable code (exe's). Libraries (packages, %include files, APIs, DLLs, etc.). Databases (data being processed, test data, etc.). Production documentation.

Problems resulting from poor Configuration Management :

Can’t roll back to a previous subsystem. One change overwrites another. Not knowing which code change belongs to which version. Faults which were fixed re-appear. Tests worked perfectly only on old versions.

Major Activities:

Configuration planning & setup. Configuration identification. Configuration baseline. Change management. Configuration release control. Configuration audit. Control of customer property.

Page 12: Testing Materials for Basic

Configuration Management Cycle

Creation of draft – The artifacts are created for the first time or updated from the baseline. Review of draft – The created artifacts are reviewed by all stakeholders, who may give comments/suggestions. Rework – If necessary, rework is done based on the review comments. Baseline – When all review comments are closed, the artifacts are called baselined. Versioning – Based on the comments closed, a version number is assigned.

Versioning Methods

Versioning identifies the different versions of the same document.

Two common versioning methods, XX.YY.ZZ and XX.YY

XX.YY.ZZ

XX → Major change. YY → Minor change. ZZ → Draft version (when an artifact undergoes draft creation, review and updating).

Example: 00.00.01 (the ZZ part keeps changing while review comments and draft versions evolve); 01.00.00 (once review comments are closed, XX is changed to 01 and ZZ is reset to 00: the artifact is baselined).

XX.YY

XX → Major/minor change.

YY → Draft version (when an artifact undergoes draft creation, review and updating).

Example: 00.01 (the YY part keeps changing while review comments and draft versions evolve); 01.00 (once review comments are closed, XX is changed to 01 and YY is reset to 00: the artifact is baselined).

Configuration Management Tool

Configuration management may be done either manually or using a configuration management tool. Tools manage issues more efficiently and effectively than doing it manually.

The current version worked on: Microsoft Visual SourceSafe 6.0 (VSS).

MS VSS 6.0 navigation: It is commercial software from Microsoft, available in two parts: 1) Server, 2) Client.

Test engineer responsibilities in configuration management (VSS 6.0): 1) Copy files from VSS, i.e., check out files from the repository. 2) Check in files after completing your work.

Levels of Testing

Proposal Testing - A proposal is made to the customer on the basis of a Request for Proposal (RFP) / Request for Information (RFI) / Request for Quotation (RFQ). The proposal is prepared and reviewed based on the customer's needs.

Requirement Testing - Requirements must be reviewed by a business analyst to check whether they can be converted into the product or not. (Clarity, completeness, measurability)

Design Testing - The design must be tested to check whether it is implemented as per the requirements or not. It includes creation of data flow, activity, and state transition diagrams.

Code review - It includes reviewing code files, database schemas, classes, object definitions, procedures & methods. Code must be clear, complete, measurable, traceable and maintainable.

Page 13: Testing Materials for Basic

Unit Testing (Black Box) - A unit is the smallest part of the software system which is testable. (Code files)

Module testing - Many units come together to form a module. A module may work on its own or may need a stub/driver for its execution.

Integration testing (Structural) – Integrating the modules makes an integrated system; this level also tests the interfaces between different modules.

Integration approaches

Bottom-up testing – Starts from unit testing, then proceeds through module and integration testing to system testing. Top-down testing – Starts from system testing and ends with unit testing. Modified top-down approach (combined bottom-up and top-down, one after another): 1) start from unit testing and move up to system testing; 2) then start from system testing and move down to unit testing.

Big Bang Approach (System Testing / Final Testing) - This testing is last part of software development, ensuring that requirements are defined and documented in requirement specifications and design specifications are met successfully.

Sandwich Testing (Combined Top Down and bottom Up )

It includes the features and advantages of both, and is done either simultaneously or one after another.

Critical path first (Main functions) - Tests the critical parts of the system first. It is used where complete system testing is not possible because the system is very large.

Debugging – Code checking to locate the cause of defect.

SPECIAL TESTS

Complexity Testing – A verification technique where the complexity of the system design and coding is verified through reviews, walkthroughs or inspections as per a planned arrangement.

Control Flow – Process oriented; it defines the direction of control flow as per the decisions of the system.

Data Flow – Information oriented; it passes data from one component to another.

Control Flow Graph

It shows the flow of control when a program is executed. It describes the logical structure of a software unit: where control can go while executing instructions.

Page 14: Testing Materials for Basic

If a program does not have any kind of decision, flow happens in a single direction. Whenever any decision is taken by the program, different flow paths are possible.

Each flow graph consists of nodes (statements/expressions) and edges (transfer of control between nodes).

When decisions are encountered, the number of edges e increases faster than the number of nodes n, and this represents the complexity of the code.

Cyclomatic Complexity – This mainly concentrates on the number of decisions, which affects the complexity of the application. It measures the amount of decision logic in a single software unit and is defined for each unit as [e – n + 2]. Designers must try to keep complexity as low as possible.
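The [e – n + 2] formula can be checked on two tiny flow graphs; the node and edge counts below are illustrative:

```python
def cyclomatic_complexity(edges, nodes):
    """V(G) = e - n + 2 for a single connected software unit."""
    return edges - nodes + 2

# A straight-line unit: a chain of 4 statements has 4 nodes and 3 edges.
print(cyclomatic_complexity(3, 4))  # 1 -> a single path, no decisions

# One if/else decision: 5 nodes (entry, decision, two branches, join)
# and 5 edges, giving 2 independent paths.
print(cyclomatic_complexity(5, 5))  # 2
```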

Compatibility Testing - Testing technique that validates how well software performs in a particular hardware/software/operating system/network environment. It is performed by the testing teams.

Security Testing - A process to determine that an information system protects data and maintains functionality as intended. It can be performed by testing teams or by specialized security-testing companies.

Vulnerability Testing - There are some weak parts of the system represent the vulnerabilities in the System. These parts of the systems are less protected and represent weaker Parts.

Performance Testing – Finds whether the system meets its performance requirements under normal conditions. The process of measuring various efficiency characteristics of a system, such as response time, throughput, load, stress, transactions per minute and transaction mix.

Volume Testing (Load) – Determines the maximum volume or load a system can accept (concurrent users, number of connections, maximum size of attachments).

Recovery Testing - Testing technique which evaluates how well a system recovers from crashes, hardware failures, or other catastrophic problems. It is performed by the testing teams.

Stress Testing - Checking the application's behavior under stress conditions; in other words, reducing the system resources while keeping the load constant and checking how the application behaves (beyond the limits of its specified requirements).

Installation Testing - Checking whether we are able to install the software successfully, as per the guidelines given in the installation document.

Un-Installation Testing - Checking whether we are able to uninstall the software from the system successfully.

Upgradation Testing – Performed to upgrade an application from an older version to a newer version. During upgradation, the installer should be able to identify the older version of the same application.

Page 15: Testing Materials for Basic

Regression Testing - New functionality added to the existing system, or modifications made to it, may introduce side effects. Regression testing helps to identify these side effects.

Error Handling Testing - Checking that the application is able to detect errors and handle (display) error messages when a normal user enters wrong data or selects a wrong option.

Smoke Testing – By accessing the major functionality, it tells the tester whether the application is alive or not.

Control Testing – Checks data validity (accuracy & completeness), file integrity (allowing only authorized users), and backup and recovery.

Inter-System Testing – Checks the interfaces between two or more systems, and that information is transferred correctly between the different systems.

Sanity Testing – Performed to test the major behavior and functionality of the application. The depth of sanity testing is more than smoke testing.

Ad-Hoc Testing (Monkey, Exploratory, Random) - Testing performed without planning and documentation.

Monkey Testing - Testing conducted on an application unevenly (randomly) or in a zig-zag way, with the intention of finding tricky defects.

Execution Testing – Checks that the system achieves the desired level of proficiency in the production or user environment.

Operations Testing – Operation Testing is performed to check that operating procedures are correct as documented and staff can execute the application by using the documentation.

Compliance Testing - Checks whether the system was developed in accordance with standards, procedures and guidelines.

Decision table (Axiom) testing – Creates different combinations of true/false for all input conditions for making decisions.
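The enumeration of true/false combinations for a decision table can be sketched with itertools; the condition names and the login/error actions below are hypothetical:

```python
from itertools import product

# Build every true/false combination for the input conditions of a decision table.
conditions = ["valid_username", "valid_password"]
for combo in product([True, False], repeat=len(conditions)):
    rule = dict(zip(conditions, combo))
    action = "login" if all(combo) else "show error"  # hypothetical actions
    print(rule, "->", action)
# 2 conditions give 2**2 = 4 rules; n conditions give 2**n rules.
```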

Documentation Testing - When the system is delivered to the user, checks whether the product is delivered along with its documentation (for future maintenance).

Training Testing - Checks that users are trained to use the system that was developed. Also checks whether training materials are delivered or not.

User Interface Testing - Performed to check how user-friendly the application is: font colors, screen background, navigation, etc.

Page 16: Testing Materials for Basic

Rapid Testing – Finds the biggest bugs in the shortest time and provides the highest value for money.

Requirement (Specification) Testing - Checks that the system being made will satisfy the user's needs, and that the requirements are complete and valid.

Benchmarking Testing - Takes the best product as a reference for evaluating the current product. Based on the differences, the current product can be improved into the best one.

‘COTS’ Testing (Commercial Off-The-Shelf) – Software readily available in the market, which users can buy and use directly.

Example: a development organization building software for banks, finance institutes, etc. may not want to develop an automation testing tool for its own test requirements. In such a case, the organization may buy software from outside and use it without investing much time and resources in making such software in-house.

Concurrency Testing – Checks whether multiple users can access the application at the same time or not.

Data Warehouse Testing – Focuses on data storage. Data analysts may perform complex queries and analysis; this testing checks whether they slow down the operational system or not.

Agile Testing - A software testing practice that follows the principles of the agile manifesto, emphasizing testing from the perspective of the customers who will use the system. (Delivering working software at a faster speed)

Page 17: Testing Materials for Basic

TESTING TOOLS

Guidelines for Selecting tools

The tool must match its intended purpose. Select a tool that is appropriate for the life cycle phase. Match the tool with the skills of the testers. The cost of the tool must be affordable.

Category of tools

Static(Verification) Tools

Code complexity tools – for measuring the complexity of code. Data profiling tools – for optimizing the database. Code profiling tools – for optimizing code. Test data generators – for creating test data. Syntax checking tools – for verifying the correctness of code.

Dynamic (Validation) Tools

Regression testing tools. Defect tracking & communication tools. Performance, load and stress testing tools.

When to Use Automation Tools

Complexity of test cases. Number of iterations of testing. Test case dependency (direct effect on the next test case).

Test Scripts – Created for executing test cases in an automated environment.

Boundary value analysis and equivalence partitioning are both test case design strategies in black box testing.

Equivalence Partitioning:

Page 18: Testing Materials for Basic

In this method the input domain data is divided into different equivalence data classes. This method is typically used to reduce the total number of test cases to a finite set of testable test cases, still covering maximum requirements..E.g.: If you are testing for an input box accepting numbers from 1 to 1000 then there is no use in writing thousand test cases for all 1000 valid input numbers plus other test cases for invalid data.Using equivalence partitioning method above test cases can be divided into three sets of input data called as classes. Each test case is a representative of respective class.So in above example we can divide our test cases into three equivalence classes of some valid and invalid inputs.

Test cases for an input box accepting numbers between 1 and 1000 using equivalence partitioning:

1) One input data class with all valid inputs: pick any single value from the range 1 to 1000 as a valid test case; one test case for valid input data is sufficient.
2) An input data class with all values below the lower limit, i.e. any value below 1, as an invalid input test case.
3) Input data with any value greater than 1000, to represent the third (invalid) input class.
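The three equivalence classes above can be sketched in Python. This is a minimal illustration: the validator `is_valid_input` and the chosen representative values are hypothetical, not part of the original material.

```python
# Minimal sketch of equivalence partitioning for an input box that
# accepts numbers from 1 to 1000. One representative value is tested
# per equivalence class instead of all 1000+ possible inputs.

def is_valid_input(value):
    """Return True if the value lies in the valid range 1..1000."""
    return 1 <= value <= 1000

# One representative test value per equivalence class:
representatives = {
    "valid (1..1000)": 500,    # any single value from the valid class
    "invalid (< 1)": -5,       # any value below the lower limit
    "invalid (> 1000)": 1500,  # any value above the upper limit
}

for name, value in representatives.items():
    expected = name.startswith("valid")
    assert is_valid_input(value) == expected, name
print("all equivalence-class checks passed")
```

Any other value from the same class (e.g. 42 instead of 500) would exercise the program the same way, which is why one representative per class is enough.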

Boundary value analysis:

Input values at the extreme ends of the input domain cause more errors in a system: more application errors occur at the boundaries of the input domain. The 'boundary value analysis' testing technique is used to identify errors at the boundaries rather than those in the center of the input domain. Boundary value analysis is the next step after equivalence partitioning for designing test cases, where test cases are selected at the edges of the equivalence classes.

Test cases for an input box accepting numbers between 1 and 1000 using boundary value analysis:

1) Test cases with test data exactly at the input boundaries of the input domain, i.e. values 1 and 1000 in our case.
2) Test data with values just below the extreme edges of the input domain, i.e. values 0 and 999.
3) Test data with values just above the extreme edges of the input domain, i.e. values 2 and 1001.
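The boundary test points listed above can be derived mechanically. A small sketch, where the helper `boundary_values` is hypothetical:

```python
# Minimal sketch of boundary value analysis for the 1..1000 range.
# For each boundary we take the boundary itself plus the values
# immediately below and above it.

def boundary_values(low, high):
    """Return the standard boundary test points for a [low, high] range."""
    return sorted({low - 1, low, low + 1, high - 1, high, high + 1})

print(boundary_values(1, 1000))  # [0, 1, 2, 999, 1000, 1001]
```

The six values match the three cases above: the exact boundaries (1, 1000), just below them (0, 999), and just above them (2, 1001).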


Sample test scenario and Test cases for login Screen

Positive test cases

1. Enter valid username and password.

2. Click on the forgot-password link and retrieve the password for the username.

3. Click on the register link, fill out the form, and register a username and password.

4. Use the Enter key after typing the correct username and password.

5. Use Tab to navigate from the username textbox to the password textbox and then to the login button.

Negative test cases

1. Enter a valid username and an invalid password.

2. Enter a valid password but an invalid username.

3. Keep both fields blank and hit Enter or click the login button.

4. Keep the username blank and enter a password.

5. Keep the password blank and enter a username.

6. Enter both the username and the password wrong.
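The positive and negative login cases above could be automated against a toy login function. This is only a sketch: the `login` function and the credential store are hypothetical, standing in for the real application under test.

```python
# Hypothetical login check used to illustrate automating the
# positive and negative login test cases above.

VALID_USERS = {"alice": "s3cret"}  # stand-in credential store

def login(username, password):
    """Return True only when both fields are filled and match a stored user."""
    if not username or not password:
        return False  # negative cases 3-5: blank field(s)
    return VALID_USERS.get(username) == password

# Positive case 1: valid username and password.
assert login("alice", "s3cret")
# Negative cases: invalid password, invalid username, blank fields, both wrong.
assert not login("alice", "wrong")
assert not login("bob", "s3cret")
assert not login("", "")
assert not login("", "s3cret")
assert not login("alice", "")
assert not login("bob", "wrong")
print("all login test cases passed")
```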


Test Scenario

Test Scenario (Positive) – functional view

Test Scenario: A user with a valid username and password should be allowed to log in.
Expected Result: The user is shown the email screen once they submit the login credentials.

Test Scenario (Negative) – security and functional view

Test Scenario: A user with an invalid username and password should not be allowed to log in.
Expected Result: The user is shown an error message for an invalid username and password.

Sample Test case Template

Fuzz Testing – A black-box testing technique that feeds random malformed data to a program to check whether anything breaks in the application.
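A minimal fuzzing sketch: generate random junk strings and feed them to the function under test, checking only that it never crashes. The target `parse_age` is hypothetical, used purely for illustration.

```python
# Minimal fuzz-testing sketch: throw random printable strings at a
# parser and verify it handles every input without raising.
import random
import string

def parse_age(text):
    """Parse an age field; reject non-numeric or out-of-range input."""
    try:
        age = int(text)
    except ValueError:
        return None  # malformed input is rejected, not crashed on
    return age if 0 <= age <= 150 else None

random.seed(0)  # fixed seed so the fuzz run is reproducible
for _ in range(1000):
    junk = "".join(random.choices(string.printable, k=random.randint(0, 20)))
    parse_age(junk)  # must not raise, whatever the input
print("fuzz run completed without crashes")
```

Real fuzzers (e.g. coverage-guided tools) are far more sophisticated, but the core idea is the same: random bad data plus a "did anything break?" check.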

Scalability Testing - Checks whether the functionality and performance of a system can cope with changes in volume and size as per the requirements.

Data-Driven Testing - An automation testing process in which the application is tested with multiple sets of data, with different preconditions, as input to the script.
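Data-driven testing can be sketched as one test script iterating over a table of input rows and expected results. The `apply_discount` function and the data rows here are hypothetical examples:

```python
# Sketch of data-driven testing: a single test script runs once per
# data row, so adding coverage only means adding rows, not code.

def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    return round(price * (1 - percent / 100), 2)

# Each row is (input price, discount percent, expected result).
test_data = [
    (100.0, 10, 90.0),
    (50.0, 0, 50.0),
    (80.0, 25, 60.0),
]

for price, percent, expected in test_data:
    result = apply_discount(price, percent)
    assert result == expected, f"{price} at {percent}% -> {result}"
print("all data-driven rows passed")
```

In practice the rows usually come from an external source (CSV, spreadsheet, database) rather than being hard-coded, which is what lets non-programmers extend the test data.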

Interface Testing - Done to check whether the individual modules communicate properly as per specifications.

Test Harness – Configuring (constructing) a set of tools and test data to test an application under various conditions.

Entry Criteria and Exit Criteria Software Testing

Entry Criteria - SRS, Use Cases, Test Cases, Test Plan. Exit Criteria - Test Summary Report, Metrics, and Defect Analysis Report.

Globalization Testing - The process of verifying whether software can run independently of its geographical and cultural environment. Ex: checking that the application has features for setting and changing the language, date format, and currency if it is designed for global users.

Localization Testing - Verifying a globalized application for a particular locale of users and its cultural and geographical conditions.

Compatibility Testing - Checking whether the application is compatible with different software and hardware environments.


Test Case - A set of preconditions and steps to be followed, with input data and expected behavior, to validate a functionality of a system.

Good Test Case - A test case that has a high probability of catching defects is called a good test case.

Use Case Testing - Validating software to confirm whether it is developed as per the use cases or not.

Defect Age - The time gap between the date of detection and the date of closure of a defect.

Bucket Testing – Users or page variants are grouped together within buckets; mostly used to study the impact of various product designs on website metrics (also known as A/B or split testing).

User Acceptance Testing (UAT) - Testing of the system by the client to verify that it adheres to the provided requirements.

Code Walk Through - Informal analysis of the program source code to find defects and verify coding techniques.

Early Testing - Conducting testing as soon as possible in development life cycle to find defects at early stages of SDLC.

Exhaustive Testing - Testing functionality with all valid, invalid inputs and preconditions is called exhaustive testing.

Defect Clustering - A small number of modules or functionalities may contain most of the defects; concentrate more testing on these areas.

Pesticide Paradox - If the prepared test cases are no longer finding defects, add or revise test cases to find more defects.

Static Testing - Verification of the code without executing the program.

End-to-End Testing - Testing the overall functionality of the system including the data integration among all the modules.

Exploratory Testing - Exploring the application, understanding the functionality, and adding or modifying existing test cases for better testing.

Usability Testing - Checking how easily the end users are able to understand and operate the application.

Software Testing Life Cycle (STLC) - Write Test Plan, Test Scenarios, Test Cases; Execute Test Cases; Record Test Results; Defect Reporting, Defect Tracking, Defect Closing; Test Release.

CMMI - Capability Maturity Model Integration.