
TAMPERE UNIVERSITY OF TECHNOLOGY

Test Plan – Group 14

WRM Data Visualization Screens – Version 1.2

TUT Dept of Software Systems OHJ-3500 Software Engineering Project

Author: Jesse Heininen    Printed: 13.2.2012 21:20:00

Distribution: Tero Ahtee

Group members:

Chandrasekar Kannaiah
Abu Shumon

Advisers:

Marko Leppänen
Marie-Elise Kontro

[email protected]

[email protected]@tut.fi

[email protected]@tut.fi

Document status: Final Modified: 13.2.2012 22:20:00


VERSION HISTORY

Version   Date         Author                        Description
1.0       01.12.2011   Heininen, Shumon, Kannaiah    First draft, test cases, criteria
1.1       02.12.2011   Shumon                        Updating and merging
1.2       09.02.2012   Shumon                        Final updates


TABLE OF CONTENTS

INTRODUCTION
  1.1 Purpose and scope
  1.2 Product and environment
  1.3 Definitions, acronyms and abbreviations
  1.4 References
  1.5 Overview
ENVIRONMENT RESTRICTIONS
  1.6 Hardware
  1.7 Software
  1.8 Security
  1.9 Tools and data
STAFFING REQUIREMENTS
  1.10 People
  1.11 Training and required skills
RESPONSIBILITIES
  1.12 Integration test group
  1.13 System test group
  1.14 User interface test group
REQUIRED OUTPUTS
SPECIALITIES
  1.15 Functionality not to be tested
ORDER AND METHODS OF TESTING TASKS
  1.16 Order of tasks
  1.17 Components to be tested
  1.18 Test case classes
  1.19 Methods and techniques
  1.20 Coverage
  1.21 Restrictions
TEST CASES
  1.22 Database testing
      Database query test – Medium
      Database generator test – Low
  1.23 Related software testing
      Vaadin compatibility test – Critical
      Browser compatibility test – Critical
      Java Runtime Library Test – Medium
  1.24 User interface testing
      Enterprise menu – Low
      Assets view – Medium
      Equipment view – Critical
      Data view – Medium
  1.25 Interface testing
      Co-operating with client application – Critical
  1.26 Printing testing
      Basic page print test – Low
  1.27 Security testing
  1.28 Recovery testing
  1.29 Performance testing
      One equipment – much traffic – Critical
      Several assets – much traffic, some alarms – Critical
      Many assets – much traffic, many alarms – Medium
  1.30 Regression testing
      Benchmark tests – Critical
  1.31 Installation and uninstallation
  1.32 Usability testing
      Moving between views – Low
      Getting data from longer time interval – Low
  1.33 Special cases
      Warning levels – Medium
CRITERIA AND REQUIREMENTS
  1.34 Acceptance
  1.35 Rejection
  1.36 Requirements for interrupting testing
  1.37 Requirements for continuing testing
  1.38 Requirements for quitting testing
  1.39 Requirements for rejecting code
RISK MANAGEMENT
SCHEDULE AND TASKS
ACCEPTANCE
  1.40 Analysis of testing
  1.41 Acceptance of test cases
  1.42 Acceptance of total testing


INTRODUCTION

1.1 Purpose and scope

This test plan document contains the complete testing plan for the WRM Data Visualization Screens and describes the test cases to be implemented for the system. It provides visibility into the design and provides the information needed for software support.

The project aims to create a UI component for an existing application of the client, Wapice Oy. This component will replace the UI of the existing application.

Diagram 1.1 shows the hierarchical view of the application. As depicted in the diagram, the application has three views: main view, enterprise view and asset view. The views are explained in detail in chapter 2.

1.2 Product and environment

The user interface is an add-on component for an existing web application framework. The data to be shown is produced by terminal devices [ref: table 1.1 & section 1.4] which have varying numbers and types of sensors. Terminals send measurement data to a database, from which a web application can read it. Since complex data is difficult for ordinary users to read and understand, the client wants a more understandable view and an easy-to-use interface.

The component is an internet web application written in Java EE using the Vaadin framework with SVG. The database is an SQL-based relational database. All data is already available, and the project does not require any work on existing components such as the database or terminals.
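As an illustration of the technology stack only, a Vaadin 6 style UI class of this kind could look roughly like the Java sketch below. The class and view names are illustrative assumptions, not the project's actual code.

    import com.vaadin.Application;
    import com.vaadin.ui.Button;
    import com.vaadin.ui.Component;
    import com.vaadin.ui.Label;
    import com.vaadin.ui.VerticalLayout;
    import com.vaadin.ui.Window;

    // Illustrative sketch of a Vaadin 6 application with a simple view switch.
    // Names are hypothetical; the real views are described in chapters 7 and 8.
    public class WrmVisualizationApplication extends Application {

        private final VerticalLayout viewArea = new VerticalLayout();

        @Override
        public void init() {
            Window main = new Window("WRM Data Visualization Screens");
            setMainWindow(main);

            // Navigation buttons corresponding to the enterprise and asset views.
            main.addComponent(new Button("Enterprises", new Button.ClickListener() {
                public void buttonClick(Button.ClickEvent event) {
                    showView(new Label("EnterpriseView: list of enterprises"));
                }
            }));
            main.addComponent(new Button("Assets", new Button.ClickListener() {
                public void buttonClick(Button.ClickEvent event) {
                    showView(new Label("AssetsView: miniature asset images"));
                }
            }));

            main.addComponent(viewArea);
            showView(new Label("Main view"));
        }

        private void showView(Component view) {
            viewArea.removeAllComponents();
            viewArea.addComponent(view);
        }
    }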

1.3 Definitions, acronyms and abbreviations

Asset                  Subject under observation, e.g. a car is an asset that is being observed by a car rental enterprise.
Client                 Wapice Oy.
Component              Unit of the existing application. In this document, component generally refers to the UI component of the existing application that the team develops.
Database               The database contains all data from all enterprises and devices.
DataGenerator          Tool used to generate test data in the database.
Data point             A modifiable position in the asset image, mapped by a terminal.
Eclipse                Code editor with Vaadin support.
Enterprise             Collection of certain types of assets. A user can have any number of enterprises, and each of them can include different numbers of certain types of assets.
End user               User of the application.
Equipment              Sensor; a device with a wireless connection over which it sends data to the database. Every asset contains a certain number of different types of sensors, such as temperature, pressure, velocity and positioning.
IDE                    Integrated development environment.
Inspection             Inspection of documents and code.
Java EE                Java Platform, Enterprise Edition.
Message box            An information box that contains only a message in text format, mostly used for error reporting, for example "Can't connect to database!" or "No data in selected time interval!"
MySQL                  Relational database management system used for the test database.
OS                     Operating system.
Sensor                 Measuring device; it stores measurement data and sends it to a terminal.
SVG                    Scalable Vector Graphics.
SVN                    Version control tool (Subversion).
SQL                    Structured Query Language for relational databases.
SRS                    Software requirements specification.
Terminal               A sensor in the asset that is used to monitor one of the properties of an asset.
Test database          Database that will be used for development purposes.
UI                     User interface.
Vaadin                 Web development framework that uses Java.
Visualization screen   Screen that shows the application data.

Table 1.1 Definitions and explanations

1.4 References

Wapice Oy           Wapice Oy is a Finnish software company working with leading industrial manufacturing and energy companies. http://www.wapice.com, 30.11.2011

Java EE             Java Platform, Enterprise Edition (Java EE) is the industry standard for enterprise Java computing. The lightweight Java EE Web Profile can be used to create next-generation web applications, and the full Java EE 6 platform for enterprise applications. http://java.sun.com/j2ee/, 30.11.2011

JUnit               JUnit is a unit testing framework for the Java programming language. JUnit has been important in the development of test-driven development, and is one of a family of unit testing frameworks collectively known as xUnit that originated with SUnit. http://en.wikipedia.org/wiki/JUnit, 30.11.2011

Vaadin Framework    Vaadin is a web application framework for Rich Internet Applications (RIA). In contrast to JavaScript libraries and browser-plugin based solutions, it features a robust server-side architecture, which means that the largest part of the application logic runs securely on the server. http://vaadin.com, 30.11.2011

SVG                 Scalable Vector Graphics (SVG) is a family of specifications of an XML-based file format for describing two-dimensional vector graphics, both static and dynamic (i.e. interactive or animated). SVG images and their behaviors are defined in XML text files, which means that they can be searched, indexed, scripted and, if required, compressed. http://en.wikipedia.org/wiki/Scalable_Vector_Graphics, 30.11.2011

SVN                 Subversion (often abbreviated SVN, after the command name svn) is a software versioning and revision control system distributed under a free license. Developers use Subversion to maintain current and historical versions of files such as source code, web pages, and documentation. http://en.wikipedia.org/wiki/Apache_Subversion, 30.11.2011

1.5 Overview

Chapter 1    This chapter gives the scope of the document and defines the product and the terms used.
Chapter 2    This chapter describes the environmental restrictions of the project.
Chapter 3    This chapter describes the staffing requirements of the project.
Chapter 4    This chapter defines the responsibilities of the working group members.
Chapter 5    This chapter specifies the required outputs of the product.
Chapter 6    This chapter describes the specialities of the project.
Chapter 7    This chapter specifies the order and the different methods of testing.
Chapter 8    This chapter describes the different test cases of the project in detail.
Chapter 9    This chapter specifies all criteria and requirements of the project.
Chapter 10   This chapter describes the risk management of testing.
Chapter 11   This chapter defines the tasks and schedule of the project.
Chapter 12   This chapter specifies the analysis of testing.


ENVIRONMENT RESTRICTIONS

1.6 Hardware

Wapice Oy already has servers that can run this application, and they are responsible for administering the system.

1.7 Software

The group is not aware of which environment Wapice uses on its servers, but that information is not necessary for development because Java has no operating system requirements. The servers and databases are also not bound to any particular operating system.

The development environment is a Windows-based operating system (Vista or Windows 7) with the Eclipse IDE, which includes a Java EE compiler and the Vaadin plugin. Eclipse keeps the compiler and plugin up to date, so the group is always using the newest versions.

For testing, the group uses an Apache Tomcat server and a MySQL database. In addition, Visual C# Express Edition is used to build a data generator tool for inserting test data into the database.

The testing environments are all common browsers (Internet Explorer, Mozilla Firefox, Apple Safari and Google Chrome) with Java SDK support.

Browser/platform    Firefox    IE       Safari    Chrome
Windows             X (1)      X (1)    X         X (2)
Linux               X (4)      -        X (4)     -
Unix                X (2)      -        -         X (3)

Priority order: 1 = highest, 2 = higher, 3 = medium, 4 = low.

1.8 Security

The UI component is not responsible for security and recovery; these are handled by the host application. The UI component therefore does not take security issues into account.

1.9 Tools and data

The project group uses the WRM data generator [ref: table 1.1], Bugzilla and JUnit in Eclipse [ref: section 1.4] for the different kinds of testing and data manipulation.


STAFFING REQUIREMENTS

1.10 People

There are three people in the team; their roles are listed below. The project manager monitors the overall situation and will provide testing support if needed.

Name                     Role
Abu Shumon               project manager / developer / tester
Chandrasekar Kannaiah    developer / tester
Jesse Heininen           contact person / developer / tester

1.11 Training and required skills

The product is fairly simple, so no prior training or special skills are required. Project testers therefore do not need any special training or skill development sessions before going through the testing phase.


RESPONSIBILITIES

1.12 Integration test group

Group size: 3

Members: Shumon, Kannaiah, Heininen
Test case 1: X
Test case 2: X
Test case 3: X
Final wrap-up: X

Responsible: Chandrasekar.

1.13 System test group

Group size: 3

Members: Shumon, Kannaiah, Heininen
Test case 1: X
Test case 2: X
Test case 3: X
Final wrap-up: X

1.14 User interface test group

Group size: 3

Members: Shumon, Kannaiah, Heininen
Test case 1: X
Test case 2: X
Test case 3: X
Final wrap-up: X


REQUIRED OUTPUTS

Required outputs differ between test runs because of the data generator. It is possible to use fixed base numbers (seeds) for the random series so that the group knows which values were generated; this is probably the best method.
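For example, if the generator's random series is created with a fixed seed (an assumption about how the generator could be configured, shown here in Java), every run produces the same sequence of values, so the expected outputs are known in advance:

    import java.util.Random;

    // With a fixed seed the "random" measurement values are reproducible,
    // so the test group knows in advance which values were generated.
    public class SeededValues {
        public static void main(String[] args) {
            Random random = new Random(14L);                   // fixed base number
            for (int i = 0; i < 3; i++) {
                double value = 20 + random.nextDouble() * 5;   // e.g. a temperature
                System.out.println("generated value " + i + ": " + value);
            }
            // Running this again prints exactly the same three values.
        }
    }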

All images that the product shows must be displayed correctly, together with everything included in the images, such as data point positions and values.

All enterprises and assets should be shown correctly in the views where they are needed. The correct reference information can be read directly from the database.


SPECIALITIES

1.15 Functionality not to be tested

Database testing is restricted by the client because they handle all communication with the database. The group's only part is testing the data coming from the database. In the testing phase the group uses a test data generator, whose functionality should be tested.

There is no need to test the connection between the host application and the database because that is done by the client.

Nothing concerning the host application, its safety regulations or its recovery plans is part of this test plan; those are all covered by the client.

Uninstallation is not tested because this product is only a component of a larger application and is deployed once.


ORDER AND METHODS OF TESTING TASKS

1.16 Order of tasks

Testers should start with unit testing and go through every single function in the project. After that, they can concentrate on larger collections and on system testing. The project includes more user interface and unit testing than actual system testing.

Tests have been divided into three groups by severity of error: critical, medium and low. When testers go through the different functionality in the plan, they also proceed from the most critical to the least critical.

The order of testing follows the section numbers in chapter 8, taking the highest priority task first and then the rest.

1.17 Components to be tested

The product includes four different views, and every view includes different types of functions. The larger components to be tested are in particular svgGenerator and drawSvg.

1.18 Test case classes

EnterpriseView    Includes a list of different types of enterprises. Priority: medium
AssetsView        Includes a list of miniature images of different types of assets. Priority: medium
EquipmentView     A larger image of one asset with its current data. Priority: highest
DataView          A more specific view of data from a certain time interval. Priority: highest

1.19 Methods and techniques

Tests are run in the order critical -> medium -> low. Chapter 9 describes how the group can interrupt testing if the results are not clear or some errors prevent further testing. In regression testing there can be situations where all tests have to be run again after testing has been interrupted. Because all code and database information is available, this can be called white-box testing.
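One possible way to drive that order in practice is sketched below with JUnit 4 categories; the test and class names are made up for illustration, and the plan does not prescribe this mechanism.

    import org.junit.Assert;
    import org.junit.Test;
    import org.junit.experimental.categories.Categories;
    import org.junit.experimental.categories.Categories.IncludeCategory;
    import org.junit.experimental.categories.Category;
    import org.junit.runner.RunWith;
    import org.junit.runners.Suite.SuiteClasses;

    public class SeverityOrderingSketch {

        // Marker interfaces used as JUnit categories for the three severity levels.
        public interface Critical {}
        public interface MediumLevel {}
        public interface Low {}

        public static class EquipmentViewTests {
            @Test
            @Category(Critical.class)
            public void equipmentViewShowsSensorValues() {
                Assert.assertTrue(true);   // placeholder for the real UI check
            }

            @Test
            @Category(MediumLevel.class)
            public void dataViewShowsTimeInterval() {
                Assert.assertTrue(true);   // placeholder
            }
        }

        // Running this suite executes only the critical tests; similar suites for
        // medium and low give the critical -> medium -> low order.
        @RunWith(Categories.class)
        @IncludeCategory(Critical.class)
        @SuiteClasses({ EquipmentViewTests.class })
        public static class CriticalSuite {}
    }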


1.20 Coverage

Tests should cover all basic usage problems and error possibilities. Most possible errors are classified as medium or low, so the test group's main concern is the critical errors.

In the module testing phase, testers concentrate on individual modules in the code and check that different types of inputs produce the right outputs. This phase requires the most time with the code and should preferably be carried out by an experienced coder.

In the integration testing phase, testers work together with the client's testers to see how the system works within the client's application. There can be interface problems as well as requirement, add-on or plugin related problems.

System testing is done on the whole system once it has been successfully integrated. It includes the most real-life test cases, and the test data should be real rather than generated.

Acceptance testing is done by the client.

1.21 Restrictions

This test plan mainly covers usage with the most common web browsers on the most common user interfaces. There are no tests for mobile phone or device compatibility.

The application has no restrictions regarding the client's location or hardware.


TEST CASES

1.22 Database testing

Database query test - Medium

In this phase it is confirmed that all queries to the database return the correct answer. The test is driven by using the software and confirming the outcome with a query run directly against the database. The answers should be exactly the same.
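A test of this kind could be written roughly as the following JUnit 4 sketch. The JDBC URL, the measurement table and its columns, and the helper that stands in for the component's query are assumptions for illustration; in the real test the first value would come through the component's own data access code.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import org.junit.Assert;
    import org.junit.Test;

    public class DatabaseQueryTestSketch {

        // Assumed test database; see table 1.1 and section 1.4.
        private static final String TEST_DB_URL =
                "jdbc:mysql://localhost:3306/wrm_test?user=test&password=test";

        @Test
        public void componentQueryMatchesDirectQuery() throws Exception {
            int sensorId = 1;
            Connection c = DriverManager.getConnection(TEST_DB_URL);
            try {
                // Stand-in for the value read through the component's query.
                double fromComponent = latestValueLikeComponent(c, sensorId);

                // The same value fetched with a query written straight to the database.
                PreparedStatement ps = c.prepareStatement(
                        "SELECT value FROM measurement WHERE sensor_id = ? "
                      + "ORDER BY measured_at DESC LIMIT 1");
                ps.setInt(1, sensorId);
                ResultSet rs = ps.executeQuery();
                Assert.assertTrue(rs.next());
                Assert.assertEquals(rs.getDouble(1), fromComponent, 0.0001);
            } finally {
                c.close();
            }
        }

        // Hypothetical replacement for the component's data access call.
        private double latestValueLikeComponent(Connection c, int sensorId) throws Exception {
            PreparedStatement ps = c.prepareStatement(
                    "SELECT value FROM measurement WHERE sensor_id = ? AND measured_at = "
                  + "(SELECT MAX(measured_at) FROM measurement WHERE sensor_id = ?)");
            ps.setInt(1, sensorId);
            ps.setInt(2, sensorId);
            ResultSet rs = ps.executeQuery();
            Assert.assertTrue(rs.next());
            return rs.getDouble(1);
        }
    }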

Database generator test – Low

The test data generator should be tested to confirm that it works properly. It should update all sensor values once a second. This can be checked by following the data views in MySQL Workbench.
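Besides following the data in MySQL Workbench, the once-a-second update could also be checked automatically, for example with a sketch like the one below (the JDBC URL and table layout are the same illustrative assumptions as above):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.sql.Timestamp;

    import org.junit.Assert;
    import org.junit.Test;

    public class DataGeneratorTestSketch {

        private static final String TEST_DB_URL =
                "jdbc:mysql://localhost:3306/wrm_test?user=test&password=test";

        @Test
        public void generatorWritesNewValuesAboutOncePerSecond() throws Exception {
            Connection c = DriverManager.getConnection(TEST_DB_URL);
            try {
                Timestamp first = newestTimestamp(c);
                Thread.sleep(1500);                  // a little over one generator cycle
                Timestamp second = newestTimestamp(c);
                Assert.assertTrue("generator produced no new data", second.after(first));
            } finally {
                c.close();
            }
        }

        private Timestamp newestTimestamp(Connection c) throws Exception {
            Statement s = c.createStatement();
            ResultSet rs = s.executeQuery("SELECT MAX(measured_at) FROM measurement");
            Assert.assertTrue(rs.next());
            Timestamp newest = rs.getTimestamp(1);
            Assert.assertNotNull("measurement table is empty", newest);
            return newest;
        }
    }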

1.23 Related software testing

Vaadin compatibility test – Critical

The test should cover all Vaadin and SVG classes so that it is certain those components work correctly. After this test has passed, it is safe to continue testing.

Browser compatibility test – Critical

All supported browsers must be tested with all test cases. If possible, the test group can also use older versions and determine the oldest version of each browser that is safe to use. The test group can prioritize which operating system/browser combinations are the most important.

Java Runtime Library Test – Medium

There should be a test that covers a few different versions of the Java Runtime library so the group can decide what the minimum requirement for this component is. There can still be some differences caused by the client's host application.

1.24 User interface testing

Enterprise menu – Low

The menu view should contain all the different types of enterprises with their pictures and text. The actual ones can be looked up in the database.

Assets view – Medium

The asset view should contain all the different types of assets as miniature images, and their information should also update in this view.


Equipment view – Critical

The view should show a certain asset's information on an SVG image, with sensor values in data points placed over the picture. Updating should work correctly.

Data view – Medium

The view should contain a data grid for a certain time interval and could also draw a graph of the data. The values can be confirmed by taking a sample from the database and plotting it, for example with a PowerPoint graph.

1.25 Interface testing

Co-operating with client application – Critical

Tests how the client's application works with our component and vice versa. There is two-way communication: customer IDs and query results come in, and queries go out.

1.26 Printing testing

Basic page print test – Low

The group can take prints of all the different views and compare them to the actual image. Layer settings and placement on paper are not defined in the requirements.

1.27 Security testing

Client is responsible

1.28 Recovery testing

Client is responsible

1.29 Performance testing

One equipment – much traffic – Critical

The test is run in the Equipment View. The refresh interval should be set as close to real-time updating as possible, and the modified data generator should work as fast as possible, so that it can be seen how the updating view stresses the system. When the generator is stopped, the view should still work normally afterwards. The test run should cover about ~10,000 rows of data.
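The ~10,000 rows could be seeded, or the generator's output complemented, with a small JDBC batch program along these lines (again assuming the illustrative measurement table used above):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.util.Random;

    public class PerformanceDataSeeder {

        public static void main(String[] args) throws Exception {
            String url = "jdbc:mysql://localhost:3306/wrm_test?user=test&password=test";
            Random random = new Random(14L);                       // fixed seed, see chapter 5

            Connection c = DriverManager.getConnection(url);
            try {
                c.setAutoCommit(false);
                PreparedStatement ps = c.prepareStatement(
                        "INSERT INTO measurement (sensor_id, measured_at, value) "
                      + "VALUES (?, NOW(), ?)");
                for (int i = 0; i < 10000; i++) {
                    ps.setInt(1, 1);                               // one piece of equipment
                    ps.setDouble(2, 20 + random.nextDouble() * 5); // generated sensor value
                    ps.addBatch();
                    if (i % 1000 == 999) {
                        ps.executeBatch();                         // flush in chunks
                    }
                }
                ps.executeBatch();
                c.commit();
                System.out.println("Seeded 10000 rows for the Equipment View test.");
            } finally {
                c.close();
            }
        }
    }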

Several assets – much traffic, some alarms – Critical

The test is run in the Assets View, and the refresh rate may vary from 1 to 5 seconds. The generator should use its default values. In this phase it is seen how the view works in normal usage. The test run should contain about ~5,000 rows of data.


Many assets – much traffic, many alarms – Medium

Set the database and generator to work with a large number of assets, with the refresh rate as close to real time as possible. The test run should cover about ~3,000 rows of data.

1.30 Regression testing

Benchmark tests – Critical

When error fixes have been made, all performance tests should be rerun. If a critical error concerned the database, the user interface or the interfaces with the client application, then those areas should be tested again.

1.31 Installation and uninstallation

These tests are covered by the interface testing in section 1.25.

1.32 Usability testing

Moving between views – Low

Tests how the user can move from one view to another, whether the data is the same in all views, and what happens if something is added while the user is in a different view.

Getting data from longer time interval – Low

Tests how the user can choose how much data he or she wants to examine in the data view, and whether any problems arise. This also depends on the client's way of handling database queries.

1.33 Special cases

Warning levels – Medium

When a piece of equipment raises an alarm, a different type of warning sign should be shown in the Assets view and the Equipment view. These must be tested in different cases, including how long an alarm may stay raised before it is disabled.


CRITERIA AND REQUIREMENTS

1.34 Acceptance

Testing is accepted if 1) all critical tests are passed, 2) at least 90% of medium tests are passed and 3) at least 75% of low tests are passed.
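Expressed as a small calculation (purely illustrative; the plan itself does not define such a helper):

    // Acceptance rule of section 9.1: all critical tests pass,
    // at least 90% of medium tests pass and at least 75% of low tests pass.
    public class AcceptanceCriteria {

        public static boolean accepted(int criticalPassed, int criticalTotal,
                                       int mediumPassed, int mediumTotal,
                                       int lowPassed, int lowTotal) {
            boolean criticalOk = criticalPassed == criticalTotal;
            boolean mediumOk = mediumTotal == 0 || mediumPassed >= 0.90 * mediumTotal;
            boolean lowOk = lowTotal == 0 || lowPassed >= 0.75 * lowTotal;
            return criticalOk && mediumOk && lowOk;
        }

        public static void main(String[] args) {
            // Example: 5/5 critical, 9/10 medium and 3/4 low passed -> accepted.
            System.out.println(accepted(5, 5, 9, 10, 3, 4));
        }
    }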

1.35 Rejection

Testing is rejected if any critical test fails, or if over 10% of medium or over 25% of low priority tests fail.

1.36 Requirements for interrupting testing

Testing will be interrupted if some error or combination of errors prevents testing from continuing, or if the results of further testing would not be clearly readable. In some cases one error may lead to other errors, and the group can decide to interrupt testing in order to fix the errors already present. In almost every case of a critical error, the coding group should be informed.

1.37 Requirements for continuing testing

If the coding group has fixed errors that could have caused problems for the remaining test cases, the test group can decide to continue testing. In this phase it is good to rerun some already passed tests whose outcome could have changed from the previous test run.

1.38 Requirements for quitting testing

Testing is finished when all required features and test cases have been tested and confirmed as successful. The whole test project is successful if the criteria of section 9.1 are met for all test cases.

If most test cases have been interrupted by errors, the whole test run can be stopped and restarted after major changes have been made.

The deadline for finishing testing is 19 January 2012. The team aims to perform all the test cases within that time.

1.39 Requirements for rejecting code

If critical errors are impossible to fix with the way the code is currently written, the whole code part should be rejected and started again. Unit testers may then have to change their methods.


RISK MANAGEMENT

The main risk as far as testing is concerned is the failure of critical test cases. To avoid surprises towards the end of the project, the team schedules smoke testing of critical test cases after every feature completion, bug fix or major code change.

Another critical risk is the failure of dependent test cases. The test cases are sorted by number of dependencies in decreasing order and executed from the top.

A minor risk is the unavailability of test systems; for instance, the unavailability of a Mac system will delay testing the application in the Safari browser. The team should schedule testing flexibly and be ready to do Safari-related testing as and when a Mac system is available on loan.


SCHEDULE AND TASKS

Testers should start with unit testing and go through every single function in the project. After that, they can concentrate on larger collections and on system testing. The project includes more user interface and unit testing than actual system testing.

Tests have been divided into three groups by severity of error: critical, medium and low. The order of execution is critical, then medium, then low.

Apart from the above, the team should be flexible enough to schedule testing based on the availability of hardware and operating systems, such as an Apple MacBook for testing the application in Safari.

Case                                                                                      Timeline
Database testing, related software testing, user interface testing, interface testing    Week 51, 52; December 2011
Printing testing, security testing, recovery testing, performance testing                Week 1; January 2012
Regression testing, installation and uninstallation, usability testing, special cases    Week 2; January 2012


ACCEPTANCE

1.40 Analysis of testing

The overall testing is analyzed through the test results and the coverage results. The team always tries to get close to a 100% pass rate and 100% coverage, though in practice this is not easy to achieve.

1.41 Acceptance of test cases

Section 9 clearly states the acceptance and rejection criteria for the test cases. After every test case, the main responsible person of that specific team will either accept or reject it.

1.42 Acceptance of total testing

The total testing will be accepted only if the test results meet the acceptance criteria defined in section 9.1. In addition, there should not be any visible crashes in the application, even though no specific test case for this is mentioned in the list of test cases. If everything goes well, the client is the final approver.


TAMPERE UNIVERSITY OF TECHNOLOGY

Test Report – Group 14

WRM Data Visualization Screens – Version 1.1

TUT Dept of Software Systems OHJ-3500 Software Engineering Project

Author: Jesse Heininen Printed: 9.3.2012 15:37:00

Distribution: Tero Ahtee

Group members

Jesse Heininen

Chandrasekar Kannaiah

Abu Shumon

Advisers

Marko Leppänen
Marie-Elise Kontro

[email protected]

[email protected]@[email protected]

[email protected]@tut.fi

Document status: Final Modified: 9.3.2012 16:37:00


VERSION HISTORY

Version   Date         Author                        Changes
1.0       08.02.2012   Jesse, Shumon, Chandrasekar   Initial report
1.1       10.02.2012   Jesse, Shumon, Chandrasekar   Test results update


TABLE OF CONTENTS

VERSION HISTORY
TABLE OF CONTENTS
1. INTRODUCTION
2. VARIANCES FROM PLAN
3. COMPREHENSIVENESS ASSESSMENT
4. SUMMARY OF RESULTS
5. TEST LOG
6. EVALUATION
7. APPROVALS
APPENDIX 1: ERROR REPORTS
   ERROR#1 Several assets, Asset view
   ERROR#2 Several assets, Enterprise view


1. INTRODUCTION

This document is the test report of the WRM Data Visualization Screens project. The test plan document is the source of this report. The report covers all test cases and their results. In addition, it gives information about changes to the test plan, the evaluation of testing and the error reports.

2. VARIANCES FROM PLAN

There are no major changes from the test plan. While the plan stays intact, the team decided to leave out testing of the lower priority browsers. Testing was carried out only in the higher priority browsers, based on the priorities defined in the test plan. Accordingly, the browsers under test were Firefox, Internet Explorer and Safari, in that order.

3. COMPREHENSIVENESS ASSESSMENT

The results of the test phase have given good input for the incremental development. Many of the critical errors were resolved towards the end of the project. The remaining errors are listed in the error report section; all of them are at most medium level errors.

4. SUMMARY OF RESULTS

All critical and major test cases passed in the test run. Only two medium level test cases failed, due to the unavailability of the asset images; the team expects to fix this soon. In addition, a few test cases were left untested because they cover optional features. The only major test case left untested was 'Co-operating with client application'; the team assumes the client will execute this test case.

5. TEST LOG

No   Name                                       Executed on      Result           Remarks
1    Database query test                        14.01.2012       Passed           none
2    Database generator test                    14.01.2012       Passed           none
3    Vaadin compatibility test                  14.01.2012       Passed           none
4    Browser compatibility test                 08.02.2012       Passed           only Firefox, IE and Safari were tested
5    Java Runtime Library test                  04.02.2012       Passed           none
6    Enterprise menu                            05.02.2012       Passed           none
7    Assets view                                05.02.2012       Passed           the optional tree view is not available
8    Equipment view                             05.02.2012       Passed           none
9    Data view                                  not applicable   not applicable   optional feature
10   Co-operating with client application       not applicable   not tested       client will do that
11   Basic page print test                      not applicable   not applicable   optional feature
12   One equipment – much traffic               08.02.2012       Passed           none
13   Several assets, asset view                 09.02.2012       Failed           none
14   Several assets, enterprise view            09.02.2012       Failed           none
15   Benchmark tests                            not applicable   not run          none
16   Moving between views                       08.02.2012       Passed           none
17   Getting data from longer time interval     08.02.2012       Passed           none
18   Warning levels                             08.02.2012       Passed           none

6. EVALUATION

As stated in the test plan, testing is accepted if 1) all critical tests are passed, 2) at least 90% of medium tests are passed and 3) at least 75% of low tests are passed. Based on the results listed above, the team accepts this test run.

7. APPROVALS

The test results were reviewed internally within the team and accepted based on the criteria mentioned in the test plan. The customer has yet to approve the report.

APPENDIX 1: ERROR REPORTS

ERROR#1 Several assets, Asset view

Date and time 09.02.2012, 10:25


Description                     When there is more than one asset, clicking a different asset should navigate the user to the asset view with the corresponding asset image displayed.
Test case used                  Several assets, asset view
Inputs                          Mouse click in the enterprise view
Expected results                When there is more than one asset, clicking a different asset should navigate the user to the asset view with the corresponding asset image displayed.
Actual results                  Only the car image works; no other image is available.
Other anomalies / observations  It looks like the images are missing from the input directory.
Severity, consequences          Medium
Test environment                Local working environment
Can the incident be repeated    Yes
Analysis on potential causes of the incident    It looks like the images are missing from the input directory.
Analysis on how the incident affected testing   The testing of different asset images was blocked.

ERROR#2 Several assets, Enterprise view

Date and time                   09.02.2012, 10:25
Description                     When there is more than one asset, the enterprise view should display all the asset images.
Test case used                  Several assets, enterprise view
Inputs                          Mouse click in the main view
Expected results                When there is more than one asset, the enterprise view should display all the images, and the images should be unique.
Actual results                  Only the car image works; no other image is available.
Other anomalies / observations  It looks like all images except the car image are missing from the input directory.
Severity, consequences          Medium
Test environment                Local working environment
Can the incident be repeated    Yes
Analysis on potential causes of the incident    It looks like all images except the car image are missing from the input directory.
Analysis on how the incident affected testing   The testing of different asset images was blocked.
