
eDIANA Embedded Systems for Energy Efficient Buildings

Grant agreement no.: 100012

Dissemination level

X PU = Public

PP = Restricted to other programme participants (including the JU)

RE = Restricted to a group specified by the consortium (including the JU)

CO = Confidential, only for members of the consortium (including the JU)

D7.1-A Demo Lab Testing Methodology

Author(s): José Manuel Marcos FAGOR Aitor Arriola IKERLAN Leire Etxeberria MU Maider Azanza MU Goiuria Sagardui MU Cengiz Gezer UNIBO Toon Van Craenendonck PCL Gerardo H. Glorioso ACCIONA Luca Gherardoni ED Naia Arana TECNALIA Adrián Noguero TECNALIA Riccardo Ukmar ST M J Martínez de Lizarduy I&IMS

Issue Date May 2011 (m28)

Deliverable Number D7.1-A

WP Number WP7: Pre-Industrial demonstration

Status Delivered


Disclaimer

The information in this document is provided as is and no guarantee or warranty is given that the information is fit for any particular purpose. The user thereof uses the information at its sole risk and liability. The document reflects only the author's views and the Community is not liable for any use that may be made of the information contained therein.

Document history

V    Date        Author                    Description
0.1  2011-02-24  FAGOR                     Draft ToC
0.2  2011-03-24  MU                        First draft
0.3  2011-03-28  UNIBO                     Draft version
0.3  2011-04-11  MU                        Reviewed by FAGOR and IKERLAN
0.3  2011-05-06  PCL, ACCIONA              Contributions to User Interface testing methodology
0.4  2011-05-11  FAGOR                     Revision of test methodology strategies
0.5  2011-05-16  TECNALIA                  Input on WP6 technology
0.6  2011-05-19  TECNALIA                  Contribution to Chapter 5
0.7  2011-05-20  ED, FAGOR                 Contributions to Chapter 2, Summary and Introduction chapters
0.8  2011-05-23  ST                        Contributions to CDC Performance tests
0.9  2011-05-24  I&IMS                     Contribution to Chapter 5
1.0  2011-05-30  FAGOR, UNIBO              Contributions to Chapters 2 and 3
1.1  2011-05-31  MU, UNIBO, ACCIONA, PCL   Final contributions to Chapters 2, 3 and 4
1.2  2011-06-01  FAGOR                     First release


Summary

The “Demo Lab Testing Methodology” is a document delivered in the context of WP7, Task 7.1: “Identification of test and test's methodology (oriented to Cell area)”, regarding the development of the test methodology needed to validate the technical developments integrated in WP5. The test methodology defined is oriented to demonstrating the new embedded technologies developed in the eDIANA project, the interoperability between eDIANA components and legacy installations, the accessibility of the platform to diverse user profiles, and the optimization of energy efficiency, introducing this new functionality by taking into account information obtained from the IP cameras.

In this context, two facilities are defined in order to deploy tests during the testing phase: a communication testbed that implements the eDIANA scenario in a down-scaled way, reproducing the home environment for the communication protocols defined along the project; and a laboratory equipped with typical home installations, reproducing key factors such as energy consumption, generation and metering, with the aim of testing the load management functionality performed by the Cell Device Concentrator (CDC) based on the power availability information provided by the MacroCell Concentrator (MCC).

The test methodology developed will be applied immediately during the testing phase, in order to perform pre-industrial tests focusing on the validation of the eDIANA technology and providing feedback for WP8, where different real-scale demonstrators will be considered.


Contents

SUMMARY
ABBREVIATIONS
INTRODUCTION
1. SUMMARY OF RELEVANT EDIANA TASKS
1.1 INPUTS FROM WP2, WP3 AND WP4
1.1.1 Cell Device
1.1.2 Inter-Cell network
1.2 INPUTS FROM WP5
1.3 INPUTS FROM WP6
2. CELL LOAD MANAGEMENT
2.1 DESCRIPTION OF THE FACILITIES
2.2 ASPECTS TO TAKE INTO ACCOUNT IN A TESTING METHODOLOGY
2.2.1 Lifecycle
2.2.1.1 Test types
2.2.1.2 Test scope or level
2.2.1.3 Activities
2.2.2 Techniques
2.2.2.1 Test strategy techniques
2.2.2.2 Test design techniques
2.2.2.2a Characteristics of test design techniques
2.2.2.2b Examples of test design techniques
2.2.3 Infrastructure
2.2.3.1 Environment
2.2.3.2 Tool
2.2.3.3 Test Automation
2.2.4 Organization
2.3 TEST METHODOLOGY
2.3.1 Lifecycle
2.3.2 Techniques
2.3.2.1 Test basis
2.3.2.2 Applying CTM technique
2.3.3 Infrastructure
2.3.3.1 Test Environment
2.3.3.2 Tools
2.3.3.3 Organization
3. COMMUNICATIONS TECHNOLOGIES
3.1 DESCRIPTION OF THE FACILITIES
3.2 TEST METHODOLOGY
3.2.1 Non-Functional Tests
3.2.1.1 Load and Stress Tests
3.2.1.2 Security Tests
3.2.1.3 Resources Tests
4. GRAPHICAL USER INTERFACE AND USER ACCESSIBILITY TESTS


4.1 GUI TEST GOAL
4.2 GUI TEST METHODOLOGY
4.2.1 Participants
4.2.2 Informed consent
4.2.3 Test facilities and materials
4.2.4 Measurements
4.2.5 Analysis
4.2.6 Results
4.3 TEST SCENARIO
4.3.1 Pre-test
4.3.2 Actual test
4.3.2.1 Accessibility
4.3.2.2 Understanding
4.3.2.3 Satisfaction
4.3.3 Debriefing
5. ENERGY EFFICIENCY OPTIMIZATION (WITH INFORMATION FROM THE IP CAMERA)
5.1 OVERVIEW - CONSIDERATIONS ABOUT THE ENERGY EFFICIENCY OPTIMIZATION
5.1.1 Functionalities of Energy Efficiency Optimization
5.1.2 Design of the Energy Efficiency Optimization module
5.2 TEST METHODOLOGY
5.2.1 Level 1. Firmware of the IP camera
5.2.1.1 Level 1. Validation platform
5.2.1.2 Level 1. Tests
5.2.2 Level 2. Communication between IP camera – SPEAr600
5.2.2.1 Level 2. Validation platform
5.2.2.2 Level 2. Tests
5.2.3 Level 3. Data stored into DB of SPEAr600 system
5.2.3.1 Level 3. Validation platform
5.2.3.2 Level 3. Tests
5.2.4 Level 4. Energy efficiency control functionality
5.2.4.1 Level 4. Validation Platform
5.2.4.2 Level 4. Tests
CONCLUSIONS
ACKNOWLEDGEMENTS
REFERENCES
APPENDIX A: UI INFORMED CONSENT FORM
APPENDIX B: DEBRIEFING QUESTIONNAIRE
APPENDIX C: CDC PERFORMANCE


Abbreviations

eDIANA  Embedded Systems for Energy Efficient Buildings
EDP     eDIANA Platform
iEi     Intelligent Embedded Interface
CDC     Cell Device Concentrator
UI      User Interface
GUI     Graphical User Interface
SUT     System Under Test
ECU     Electronic Control Unit
EMS     Energy Management System
PER     Packet Error Rate
V&V     Verification and Validation


Introduction

The purpose of Task T7.1 is to define the test methodology to be carried out in order to validate the eDIANA technology. Four main areas of work are considered: Cell load management, communications technologies, graphical user interface, and energy efficiency optimization.

Chapter 1 summarizes the developments carried out in previous work packages that are closely related to the work in this deliverable: the integration of the different components at the Cell and MacroCell levels in WP5, and the implementation of a testing and V&V methodology reported in WP6. Chapter 2 develops the test methodology for the Cell load management functionality, starting from the definition of the aspects to be considered in a testing methodology and taking into account the reference architecture defined for the different component types at the eDIANA Cell level. Chapter 3 focuses on the wireless communication technologies, providing tests to validate critical communication aspects such as range, channel access, latency, PER and routing performance. Chapter 4 describes the graphical user interface tests, which identify the impact of the eDIANA platform on user behaviour regarding energy management, focusing on accessibility, user understanding of the information, and satisfaction. Chapter 5 introduces the energy efficiency optimization module, describing its functionality and design, and finally presents the test methodology to verify and validate this module. Finally, a summary of this deliverable and future work are reported in the Conclusions chapter. A brief description of a test to validate CDC performance has also been added in Appendix C.


1. Summary of Relevant eDIANA Tasks

1.1 Inputs from WP2, WP3 and WP4

1.1.1 Cell Device

A Cell is a domain including the collection of devices that operate in different scenarios: house, office, shop, etc. In terms of energy efficiency, the Cell is in charge of all dynamic control and handling of the connected devices (e.g. appliances). In terms of supervision and control, the Cell has dynamic control over the devices attached to its concentrator (plug&play, discovery, etc.). Obviously, the control capabilities of the Cell are subordinate to the end user's choices. Each user, according to his or her preferences, configures the Cell to accept or deny the energy saving suggestions coming from the MacroCell. A schematic scenario is reported in Figure 1. It shows the hierarchical relations between the Cell and the MacroCell and between the CDC and its components. Following a bottom-up approach, the Cell level integrates all components that interact with the building elements and devices: appliances, lighting, HVAC, etc. The eDIANA Reference Architecture defines the following component types at the eDIANA Cell level:

Cell Device Concentrator (CDC)

Cell User Interface (CUI)

Cell Monitoring and Metering (CMM)

Cell Control and Actuation (CCA)

Cell Generation and Storage (CGS)

More specifically, the last three component types undertake the task of obtaining all the information about the energy consumption of all elements of the building, communicating with the Cell Device Concentrator through the iEi (intelligent Embedded interface) and executing the orders that the above components and layers define. Another component at the Cell level is the Cell User Interface. It communicates only with the CDC in order to provide Cell-level information to the user of the platform.


1.1.2 Inter-Cell network

The eDIANA intra-Cell network provides connectivity between eDIANA devices and the CDC. The CDC acts in the architecture as a concentrator of the data coming from eDIANA devices. IEEE 802.15.4/Zigbee was selected within Task 2.3 as the most suitable technology, mainly because of the plethora of products already available on the market and because it fulfils the system requirements set by the applications.

Figure 1 Schematic building scenario

1.2 Inputs from WP5

The main objective of WP5 was the integration of all the Cell and MacroCell level designs and developments described in the eDIANA Reference Architecture. The work was carried out in three different tasks, each of them covering a specific level of the integration. The integration of the different elements configures a functional eDIANA platform, which is the basis for the development of the pre-industrial demonstrators to be carried out in this WP7 and also for the real-scale demonstrators to be deployed in WP8. In the context of WP5, Task 5.1 covered the integration of the individual eDIANA components developed in WP3 and WP4, considering the specified eDIANA integration domains: appliances, sensors and actuators, display devices and the MacroCell integration layer.


Task 5.2 performed the integration of the different devices considered at the Cell level with the eDIANA Cell Device Concentrator (CDC), checked the correct integration taking into account the rules defined for the eDIANA platform, and finally reported the obtained results. Task 5.3 covered the definition and description of the results of the MacroCell integration process, including the three main modules defined in previous work packages: the MacroCell Level Concentrator, the Data Gathering Component and the Control Strategies. The MacroCell Concentrator is in turn composed of three main elements: the MacroCell Database, the Web Server and the User Interface.

Figure 2 eDIANA CDC and MCC architecture


1.3 Inputs from WP6

The main goal of WP6 was the development of a toolkit for early V&V and testing, as well as the management of V&V requirements. The work of WP6 was split into three tasks, each of which dealt with different aspects of systems V&V. The techniques and tools developed in WP6, along with other tools selected in the WP6 deliverables, provide a good starting point for the implementation of a testing and V&V methodology. In particular, task T6.1 dealt with the verification of non-functional properties in eDIANA devices. The main objective of this task was the development of a model-driven methodology that enables designers to include non-functional aspects into design models. This approach enables the connection of design models with external V&V tools, like schedulability analysis tools, performance analysis tools, etc. The approach is also valid in the testing phase, enabling the validation of the initial designs and detecting how variation points affect the non-functional properties of the system.

Figure 3: GUI of the model-driven V&V tool developed in T6.1

Task T6.2 dealt with low-level testing techniques and V&V requirements management. Regarding low-level testing, WP6 partners developed a Java runtime environment for SPEAr to allow early testing of the final algorithms on PC platforms. Additionally, some model-based functional testing techniques were discussed. Regarding V&V requirements management, T6.2 developed a methodology to take advantage of the functionality of the HP Quality Center tool for the management of these requirements. This methodology covered the description of these requirements and the traceability between requirements and test scenarios.

Lastly, task T6.3 dealt with the link between V&V requirements and certification requirements. The main goal of this task was the development of a tool that supports the management of certification requirements. Certification requirements include not only functional and non-functional requirements of the systems, but also procedural aspects of the development life-cycle. Thus, T6.3 also discussed the most relevant standards for (business) process modelling, defining the most important characteristics of each modelling standard and their capabilities regarding the modelling of certification processes. The prototype for certification requirements modelling, developed as the main result of T6.3, has also been considered for the implementation of this testing methodology, describing the testing requirements and identifying the testing evidence.

In the last stages of WP6, the HP Quality Center tool was selected for the implementation of the testing methodology of WP7. HP Quality Center is a web-based test management tool composed of five main modules for the management of testing processes, although only three modules will be used in eDIANA:

Requirements, which is used for requirements management and requirements traceability through test cases stored in the HP-QC repository.

Test Plan, which is used for creating and updating test cases. The test cases are contained in folders which are displayed in a tree-like structure. It can store both manual and automated test cases. Manual test cases can be written locally or imported from Excel sheets, with each test step having an Expected Result and an Actual Result section. QC supports automated scripts developed for different automation tools such as QTP, LoadRunner, WinRunner, etc. These scripts can be saved directly from the tool into the Test Plan tab of QC. However, prior to this, the appropriate QC add-in needs to be installed to support the automation tool.

Test Lab, which is used for executing the test cases stored in the Test Plan module; these can be imported locally into the Test Lab screen and run. When a manual test case is executed, a pop-up opens listing all the test steps, and the user is expected to update the status of each step with Passed, Failed or Not Complete. When an automated test case is run, QC invokes the automation tool, which in turn executes the script, stores the result back into the QC repository and displays it in the UI.

The main reason to select this tool was its integrability with Rational RequisitePro, the tool used for capturing requirements in eDIANA. As shown in the figure, the HP QC Synchronizer allows the interconnection of these two tools.

Figure 4: Requirements sync HP QC with RequisitePro


2. Cell Load Management

2.1 Description of the Facilities

The pre-industrial demonstrator for Cell load management is located in IKERLAN's facilities in Mondragon (Spain), in a demo-lab which is basically a small apartment of 40 m2 consisting of three spaces: kitchen plus dining room, living room and toilet.

Figure 5 Demo-lab in IKERLAN

The following elements will be tested in this demonstrator:

Figure 6 Configuration of the demonstrator


The demonstrator will consist of a dishwasher and an oven as home appliances that consume energy, and a Stirling engine that can produce energy. The following tables summarize the technical specifications of the oven and the dishwasher.

Total Power (kW): 3.57
Connection: 230 V / 50 Hz
Fuse rating (Amp): 16
Energy Class: A
Conventional energy consumption (kWh): 0.79

Table 1 Technical data of oven

Total Power (kW): 2.17
Connection: 230 V / 50 Hz
Fuse rating (Amp): 10
Energy Class: A
Conventional energy consumption (kWh): 0.94

Table 2 Technical data of dishwasher

There will also be a fiscal meter that will measure the real energy consumption, and uncontrolled load will be simulated with a resistor.

2.2 Aspects to Take into Account in a Testing Methodology In order to define the requirements for the testing methodology, we have followed the four cornerstones of structured testing proposed by [3]:

Lifecycle

Techniques

Infrastructure

Organization


2.2.1 Lifecycle This part deals with the process of developing and testing embedded software. A lifecycle structures that process by dividing it into phases, describing which activities need to be performed, and in what order. In the lifecycle model, the principal test activities are divided into five phases:

Planning & control

Preparation

Specification

Execution

Completion

Figure 7: Phases of testing from [3]

For organizing the testing, a master test plan is very helpful. It provides the overall picture of all relevant test activities and issues, and how they are related.


2.2.1.1 Test types

Software systems are tested for different purposes. These tests can be categorized into:

Structural Tests: They cover the structure of the SUT during test execution. The internal structure of the system must be known (white-box tests).

Functional Tests: Functional testing is concerned with assessing the functional behavior of an SUT against the functional requirements. They do not require any knowledge about system internals (black-box tests).

Non-functional Tests: Similar to functional tests, they are performed against requirements specification of the system for assessing non-functional requirements such as reliability, load, or performance requirements.

In Figure 8, a broader classification of test types is shown.

Figure 8: Test types from [3]


2.2.1.2 Test scope or level Test scopes describe the granularity of the SUT. Due to the composition of the system, tests at different scopes may reveal different failures. Therefore, they are usually performed in the following order:

Unit/Component Testing: Unit testing verifies the functioning in isolation of software pieces which are separately testable. Depending on the context, these could be the individual subprograms or a larger component made of tightly related units.

Integration Testing: Integration testing is the process of verifying the interaction between software components.

System Testing: System testing is concerned with the behaviour of a whole system.

A broader classification of test levels is shown in Figure 9; a small illustrative sketch of the unit and integration levels follows the figure.

Figure 9: Test levels from [3]
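To make the distinction between these levels more concrete, the following minimal sketch shows a unit-level test of a single helper in isolation and an integration-style test that combines it with a stub of another component. The helper function and the meter stub are hypothetical, not part of the eDIANA code base; only the power figures are borrowed from the tables in Section 2.1 for illustration.

```python
import unittest


def may_switch_on(requested_kw, current_load_kw, limit_kw):
    """Hypothetical unit under test: allow switching on only if the power limit is respected."""
    return current_load_kw + requested_kw <= limit_kw


class FakeMeter:
    """Stub standing in for the fiscal meter when testing at integration level."""

    def __init__(self, load_kw):
        self.load_kw = load_kw

    def read_load(self):
        return self.load_kw


class UnitLevelTest(unittest.TestCase):
    """Unit/component level: the helper is verified in isolation."""

    def test_dishwasher_fits_under_limit(self):
        self.assertTrue(may_switch_on(2.17, 1.0, 3.3))

    def test_oven_exceeds_limit(self):
        self.assertFalse(may_switch_on(3.57, 1.0, 3.3))


class IntegrationLevelTest(unittest.TestCase):
    """Integration level: the helper interacts with (a stub of) another component."""

    def test_decision_uses_meter_reading(self):
        meter = FakeMeter(load_kw=1.5)
        self.assertFalse(may_switch_on(2.17, meter.read_load(), 3.3))


if __name__ == "__main__":
    unittest.main()
```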


2.2.1.3 Activities Some of the typical activities to be performed for testing are the following ones [3]: The planning and control phase consists of the following activities:

1. Formulating the assignment: The objective is to determine who the commissioner and the contractor are, what the scope and the objective of the test process is, and what the preconditions are for the test process.

2. Global review and study: The objective of this activity is to gain insight into the available system and project documentation, the demands made on the system in terms of functionality and quality, the organization of the system development process, the available knowledge and experience in the field of testing, and, with respect to the acceptance test, the user demands.

3. Establishing the test basis: The objective of this activity is to determine what the test basis and the basic documentation are.

4. Determining the test strategy: The objective of this activity is to determine what to test, how to test it, and with what coverage.

5. Setting up the organization: The objective of this activity is to determine how the organization of the test process will be set up: roles, tasks, authorities, responsibilities, hierarchy, consultation structures, and reporting lines. The need for training will also be considered.

6. Specifying test deliverables: The objective of this activity is to specify the products to be delivered by the test team.

7. Specifying the infrastructure: The objective of this activity is to determine at an early stage the required infrastructure for the test process.

8. Organizing management and control: The objective of this activity is to describe how management and control of the test process, the infrastructure, and the test deliverables will be arranged.

9. Scheduling the test process: The objective of this activity is to give a detailed description of the time, money, and personnel needed for the test activities.

10. Consolidating the test plan: The objective of this activity is to record the results of the activities carried out so far, and to acquire formal approval from the commissioner.


The following activities can be identified within the framework of the coordination, monitoring, and control of the test process:

1. Maintaining the test plan: The objective of this activity is to keep the test plan and the overall schedule up to date.

2. Controlling the test: The objective of this activity is to control the test process, infrastructure, and test deliverables, in order to be able to provide constant insight into the progress of the test process and the quality of the test object.

3. Reporting: The objective of this activity is to provide the organization with information about the progress of the test process and the quality of the system under test.

4. Establishing detailed schedules: The objective of this activity is to set up and maintain the detailed schedules for the various phases: preparation, specification, execution, and completion.

2.2.2 Techniques

A technique basically describes how a certain activity must be performed. In a testing context, different techniques can be identified for each activity or phase: review techniques, safety analysis techniques, test strategy techniques, test design techniques (how to derive test cases from a certain system specification), etc. The last two types of techniques are further described below:

2.2.2.1 Test strategy techniques

A test strategy describes the choices which have to be made with respect to which parts of the system have to be tested thoroughly and which less thoroughly:

Determine what to test

Determine how to test

Determine with what coverage

2.2.2.2 Test design techniques A test design technique is a standardized method of deriving test cases from reference information.


2.2.2.2a Characteristics of test design techniques Some of the characteristics to be considered when selecting the test design techniques are the following [3]:

Black-box or white-box: Black-box techniques treat the system as a "black box", so they do not explicitly use knowledge of the internal structure; they are usually oriented to testing functional requirements. White-box techniques allow using internal knowledge of the software to guide the selection of test data; this is also known as structural testing.

Principles of test case derivation (a small illustrative sketch follows at the end of this subsection):

o Processing logic: test cases are based on detailed knowledge of the processing logic of the system to be tested.

o Equivalence partitioning: the input domain (all possible input values) is partitioned into "equivalence classes". For all input values in a particular equivalence class, the system shows the same kind of behaviour (performs the same processing).

o Boundary value analysis: a specialization of the above principle. The values that separate the equivalence classes are referred to as boundary values.

o Operational usage: the test cases are designed to simulate real-life usage.

o CRUD: used when system behaviour is centred on the lifecycle of data (create, read, update, and delete).

o Cause-effect graphing: a technique for transforming natural language specifications into a more structured and formal specification. It is particularly suitable for describing the effect of combinations of input circumstances: causes (input conditions) and effects (output conditions).

Formal or informal: A formal test design technique has strict rules on how the test cases must be derived, whereas an informal test design technique gives general rules and leaves more freedom.

Application areas: Some test design techniques are particularly suited to testing the detailed processing within one component, while others are more suited to testing the integration between functions and/or data or to test the interaction between systems and the outside world (users or other systems).

Quality characteristic to be tested: Functionality, reliability, performance, etc.


Required type of test basis: A test design technique is, by definition, a standard way of deriving test cases from a test basis and it may require a specific type of test basis.
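As a small illustration of the equivalence partitioning and boundary value analysis principles listed above, the following sketch derives one representative value per class and the values around each boundary. The priority ranges and the exact boundary handling are hypothetical, chosen only for illustration.

```python
# Hypothetical equivalence classes for a priority input Pry in [0, 100]:
#   high:   Pry < 40
#   medium: 40 <= Pry <= 80
#   low:    Pry > 80
EQUIVALENCE_CLASSES = {
    "high": range(0, 40),
    "medium": range(40, 81),
    "low": range(81, 101),
}


def representative_values():
    """Equivalence partitioning: one representative test value per class."""
    return {name: values[len(values) // 2] for name, values in EQUIVALENCE_CLASSES.items()}


def boundary_values():
    """Boundary value analysis: the values just around each class boundary."""
    boundaries = [40, 81]  # first value of the 'medium' and 'low' classes
    tests = []
    for b in boundaries:
        tests.extend([b - 1, b, b + 1])
    return sorted(set(tests))


if __name__ == "__main__":
    print("equivalence partitioning:", representative_values())
    print("boundary value analysis:", boundary_values())
```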

2.2.2.2b Examples of test design techniques

Examples of test design techniques are the following [3]:

State transition testing: It is used for embedded systems in which state-based modelling is used; those models are then used as a basis for test design. The purpose of the state-based test design technique is to verify the relationships between events, actions, activities, states, and state transitions (see the sketch after this list).

Control flow test: It is oriented to test the program structure. The test cases are derived from the structure of an algorithm and/or program. Every test case consists of a group of actions covering a certain path through the algorithm. The control flow test is a formal test design technique mostly used in unit tests and integration tests.

Elementary comparison test (ECT): It is used to test the processing in detail. The test verifies all the functional paths of a function. All functional conditions have to be identified and translated into pseudocode. The test cases are derived from the pseudocode and cover the identified functional paths. The ECT establishes condition coverage and guarantees a reasonable degree of completeness. Because this formal technique is labour intensive, it is mainly used in very important functions and/or complex calculations.

Classification-tree method (CTM): It supports the systematic design of black-box test cases. It is an informal test design technique that identifies the relevant aspects of the system under test – aspects that can influence, for instance, the functional behaviour or safety of the system. The input domain of the test object is partitioned into disjoint classes using equivalence partitioning according to the aspects identified.

Evolutionary algorithms: Evolutionary algorithms (also known as genetic algorithms) are search techniques and procedures based on the evolution theory of Darwin. Whereas test design techniques focus on individual test cases, evolutionary algorithms deal with "populations" and the "fitness" of individual test cases. Evolutionary algorithms can, for instance, be used to find test cases for which the system violates its timing constraints, or to create a set of test cases that cover a certain part of the code.


Statistical usage testing: It uses operational profiles to produce a statistically relevant set of test cases to test the operational usage of a system.

Rare event testing: It is used to test events that happen very infrequently.

Mutation analysis: In this type of testing, faults are deliberately introduced into the system under test in order to check whether the tests detect them.
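To illustrate the state transition testing technique listed above, here is a minimal sketch assuming a simplified, hypothetical appliance state machine; the states, events and transitions are invented for illustration and are not taken from the eDIANA design.

```python
# Hypothetical appliance state machine: (state, event) -> next state.
TRANSITIONS = {
    ("OFF", "switch_on"): "ON",
    ("ON", "switch_off"): "OFF",
    ("ON", "pause"): "PAUSED",
    ("PAUSED", "resume"): "ON",
    ("PAUSED", "switch_off"): "OFF",
}


def next_state(state, event):
    """Return the next state, or raise if the transition is not allowed."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"illegal transition: {event} in state {state}")


def test_all_transitions():
    """State transition testing: exercise every defined transition once."""
    for (state, event), expected in TRANSITIONS.items():
        assert next_state(state, event) == expected


def test_illegal_transition_rejected():
    """Negative case: an event that is not defined for a state must be rejected."""
    try:
        next_state("OFF", "pause")
    except ValueError:
        return
    raise AssertionError("illegal transition was accepted")


if __name__ == "__main__":
    test_all_transitions()
    test_illegal_transition_rejected()
    print("state transition tests passed")
```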

2.2.3 Infrastructure This section refers to the facilities that are required for testing. It includes those needed for executing tests (test environment) and those that support efficient and effective execution of test activities (tools and test automation).

2.2.3.1 Environment For performing the testing, specific test facilities are required depending on the stage. Embedded systems are developed and tested in stages: starting with simulated parts (simulation stage) and then replacing them one by one by the real thing (prototyping stage) until finally the real system works in its real environment (pre-production stage) [3], see Figure 10.

Figure 10: Testing in different stages (from simulation to real) [3]


In the simulation stage, executable simulation models of the systems are tested. In the prototyping stage, the following test levels are applicable:

Software unit test

Software integration test

Hardware/software integration test

System integration test

Environmental test

In the pre-production stage, the following test types are applicable:

System acceptance test

Qualification tests

Safety execution tests

Tests of production and maintenance test facilities

Inspection and/or test by government officials

Related to the simulation stage, the following test platform is found [1]:

Model-in-the-Loop (MiL): The first integration level, MiL, is based on the model of the system itself. On this platform the SUT is a functional model or an implementation model that is tested. The model exists entirely in a native simulation tool (e.g., Simulink/Stateflow). The test purpose is basically functional testing in early development phases in simulation environments.

Related to the prototyping stage, the following test platforms are found [1]:

Software-in-the-Loop (SiL): During SiL the SUT is software that is tested. The software components under test are usually implemented in C and are either hand-written or generated by code generators based on implementation models. Part of the model exists in the native simulation tool (e.g., Simulink/Stateflow), and part as executable C code (e.g., an S-function). The test purpose in SiL is mainly functional testing.

Processor-in-the-Loop (PiL): In PiL, embedded controllers are integrated into embedded devices with proprietary hardware (i.e., an ECU). Testing at the PiL level is similar to SiL tests, but the embedded software runs on a target board with the target processor or on a target processor emulator. It is the last integration level which allows debugging during tests in a cheap and manageable way.

Hardware-in-the-Loop (HiL): When testing the embedded system at the HiL level, the software runs on the final ECU. However, the environment around the ECU is still a simulated one.

Related to the pre-production stage, the last integration level is found:

System: Finally, the last integration level is obviously the system itself.

Figure 11 shows another view of the four testing configurations previously mentioned: model-, software-, processor- and hardware-in-the-loop (MiL, SiL, PiL and HiL), which together are called X-in-the-loop testing. A minimal model-in-the-loop sketch follows the figure.

Figure 11: Different test platforms [2]
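As a minimal illustration of the model-in-the-loop configuration described above, the sketch below closes the loop between a hypothetical thermostat controller model and a very simple simulated room. All models, parameters and thresholds are invented for illustration and do not correspond to any eDIANA component.

```python
def controller(temperature, setpoint=21.0, hysteresis=0.5):
    """Hypothetical on/off heating controller model: returns heater power in kW."""
    return 1.0 if temperature < setpoint - hysteresis else 0.0


def plant(temperature, heater_kw, dt=60.0):
    """Very simple simulated room: heating raises the temperature, losses pull it towards 10 degC."""
    heating = 0.002 * heater_kw * dt
    losses = 0.0005 * (temperature - 10.0) * dt / 60.0
    return temperature + heating - losses


def run_mil_test(steps=200, start_temp=15.0):
    """Model-in-the-loop test: simulate the closed loop and check a functional expectation."""
    temp = start_temp
    for _ in range(steps):
        temp = plant(temp, controller(temp))
    # Functional expectation: the loop settles near the setpoint.
    assert 20.0 <= temp <= 22.0, f"unexpected steady-state temperature: {temp:.2f}"
    return temp


if __name__ == "__main__":
    print(f"steady-state temperature: {run_mil_test():.2f} degC")
```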

2.2.3.2 Tool Tools are available for every phase in the test lifecycle. The following figure shows how tools can be categorized according to where they are applied in the testing lifecycle [3].


Figure 12: Tool classification (from [3])

2.2.3.3 Test Automation

Test automation consists of automating the execution of tests. It is used, for instance, to minimize the time needed for test execution.

2.2.4 Organization Organization is all about people, about those who perform all the testing activities, and their relationships and interactions with the others they have to deal with. The following topics are covered:

Test roles: To perform the various test activities in a test project, specific knowledge and expertise must be available in the test organization. Some examples are: Test engineer, Team leader, Test manager, Test policy manager, Methodology Support, Technical Support, Domain expert, Intermediary, Test configuration manager, Application integrator, Test automation architect, Test automation engineer, Safety manager and Safety engineer.


Organization structure: There are different ways of implementing the test organization, both in the line organization and in a project setting.

Test control: The test team must be able to provide information about status and trends regarding progress and quality. This requires that staff follow certain agreed procedures.

In the technical annex, the challenges mentioned for WP7 are centred on achieving/demonstrating:

New embedded technologies developed

Interoperability between eDIANA components and legacy installations

Energy management strategies

User Accessibility (ease of use, adaptability to diverse user profiles) without losing the support for safe, secure, maintainable, reliable, cost efficient and timely system services.

From that, we can extract the quality characteristics to test:

Functionality

Interoperability

Connectivity

Usability

Accessibility

...

Depending on the part or subsystem to be tested (load management, graphical user interface, energy efficiency optimization, CDC performance, communications, ...), the quality characteristics to test vary. In communications one of the most important quality attributes is connectivity; in user interfaces, usability and accessibility; and in Cell load management, functionality.

2.3 Test Methodology

Following the aspects of Section 2.2, each of the cornerstones of a structured testing methodology (lifecycle, techniques, infrastructure and organization) is defined below.

2.3.1 Lifecycle

Test type: functional testing. In WP7 functional testing is the main goal, although non-functional testing such as usability testing is also required. For this part (the load management testing), functionality is the most important aspect.


Test goal or level: integration and system testing. Unit-level testing has already been performed in WP3, and integration-level testing has been performed, at least partially, in WP5. Now integration/system testing must be performed.

2.3.2 Techniques

Selected test design technique: classification-tree method. The classification-tree method (CTM) is a method for the systematic design of black-box test cases on the basis of the specification. It is an informal test design technique, although there is a well-described procedure to derive test cases. CTM identifies the relevant aspects of the system under test: aspects that can influence the functional behaviour or safety of the system. The input domain of the test object is partitioned into disjoint classes using equivalence partitioning according to the aspects identified. The input domain partition is represented graphically in the form of a tree. The test cases are formed by combining classes of different aspects; this is done by using the tree as the head of a combination table in which the test cases are marked. CTM is used with the functional requirements as the test basis.

2.3.2.1 Test basis

The functional requirements from WP1 (D1.3-C) have been analyzed to identify which of them the energy management part must meet. The main requirement that will be tested in this part of the system is the following one:

Req. 2.2.3.8 - Energy consumption management: The eDIANA system enables users to manage energy consumption in a sustainable way.

The Energy management system must take into account end user preferences, tariffication and price information, power consumption, priorities, etc.:

Power consumption and production data:

o Req. 1.7.2.7 - Availability of Cell-level Devices power Consumption/Production Data in near-real-time.

o Req. 2.1.1.1.1 - Energy & load info: Every appliance should be able to provide and transmit information on their energy consumption or power load


o Req. 2.2.3.9 - The eDIANA system should be able to register energy delivery

o Req. 2.2.3.10 - Energy production registration

End user preferences:

o Req. 2.2.2.1.1 - End user preferences: The system must act according to the end-user preferences.

o Req 2.2.2.1.2 - Possibility to overwrite temporarily End User Preferences

Tariffication and price information

o Req. 2.2.2.1b.1 - Take advantage of tariffication options: Utilities will offer different tariffication options to their customers depending on the time of the day, or at demand peaks. Some appliances could delay their start, lower the consumption, etc. when the price is higher.

o Req. 2.2.3.14 - Pricing information: The eDIANA system shall support various schemes for pricing.

Priorities of home appliances:

o Req. 2.2.5.1 - Handle priorities of electrical equipment: Handle priorities of electrical equipment in a case of a price signal (from MCC): to turn off only non-essential equipment and not to turn off all equipment (could be a matrix with a priority value for each device).

The system will apply different mechanisms to obtain a sustainable energy consumption management:

Req. 2.2.3.11 - Automatic energy consumption: The eDIANA system shall provide mechanisms to switch equipment on/off or suspend energy consumption

Req. 2.2.2.2.a.4 - Full blocking of washing process

Req. 2.2.2.2.a.5 - Automatic restart of washing process

Req. 2.2.2.2.a.6 - Pause during washing process

Req. 2.2.2.2.a.9 - Switch off appliance temporarily

Checking the scenarios in deliverable D1.4-B, the scenario that will be tested is the following:

o The EMS enables programming of automated performance during certain periods of time. (See 4.1.1 and 4.1.4 of deliverable D1.4-B)

2.3.2.2 Applying CTM technique

The CTM technique consists of the following steps:

1. Identifying aspects of the test object
2. Partitioning the input domain according to the aspects
3. Specifying logical test cases
4. Assembling the test script


Identifying aspects of the test object

The first step is the identification of the aspects that influence the processing of the system under test. In this case the identified aspects are the following ones:

Priority of the home appliances

The priority (Pry) can have the following values:

Pry < 40: the device has high priority. This means that if a device with this priority asks to be active and there is not enough power, first the devices with low priority will be switched off and then the devices with medium priority will be postponed (or switched off if they cannot be postponed). If after these actions there is not enough power for the device, it will not be switched on.

40 < Pry < 80: the device has medium priority. This means that in case of insufficient power, the algorithm can postpone the device (or even switch off if it is not possible to postpone it).

Pry > 80: the device has low priority. The algorithm can switch off the device if power is needed for another device (it is not postponed).

If there are two devices with the same priority, the one that prevails will be the one with the lowest “Pry” value. The “on-off time periods” preferences can be applied to devices with medium and low priority. In this case, the following fixed priorities have been specified for testing (a sketch of this priority logic follows the list):

Stirling boiler: High

Oven: High

Dishwasher: Medium
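The priority rules described above can be summarized in executable form. The following sketch is only an illustration of the expected behaviour, useful for reasoning about test cases; the function names, the data structure and the handling of the boundary values 40 and 80 are assumptions, not the actual CDC algorithm.

```python
def classify(pry):
    """Map a Pry value to a priority class; the exact handling of the boundary
    values 40 and 80 is left open in the text, so this assignment is an assumption."""
    if pry < 40:
        return "high"
    if pry < 80:
        return "medium"
    return "low"


def request_switch_on(device, devices, limit_kw):
    """Decide whether 'device' may start, shedding other loads only for high-priority requests.
    devices: list of dicts with keys 'pry', 'power_kw' and 'state' ('on', 'off' or 'postponed')."""
    def running_load():
        return sum(d["power_kw"] for d in devices if d["state"] == "on")

    def fits():
        return running_load() + device["power_kw"] <= limit_kw

    if classify(device["pry"]) == "high" and not fits():
        others = [d for d in devices if d is not device and d["state"] == "on"]
        # Shed low-priority devices first, then postpone medium-priority ones;
        # within a class, the device with the highest Pry value (lowest priority) goes first.
        for target_class, new_state in (("low", "off"), ("medium", "postponed")):
            for d in sorted(others, key=lambda o: -o["pry"]):
                if fits():
                    break
                if classify(d["pry"]) == target_class:
                    d["state"] = new_state
    if fits():
        device["state"] = "on"
        return True
    # A medium-priority device that cannot start is postponed; otherwise it simply stays off.
    device["state"] = "postponed" if classify(device["pry"]) == "medium" else "off"
    return False


if __name__ == "__main__":
    oven = {"pry": 20, "power_kw": 2.0, "state": "off"}        # high priority (power value illustrative)
    dishwasher = {"pry": 60, "power_kw": 2.2, "state": "on"}   # medium priority, currently running
    devices = [oven, dishwasher]
    print(request_switch_on(oven, devices, limit_kw=3.3))      # True: the dishwasher is postponed
    print(dishwasher["state"])                                 # postponed
```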

Stirling Boiler

This aspect captures whether the Stirling boiler is generating energy or not. The Stirling boiler is a water heater that will be switched on when the user wants hot water, and it will generate energy at the same time. The energy profile of the Stirling is the following: it consumes some energy at the beginning and then it starts to generate energy progressively until it reaches a stable energy generation of 1 kW. For testing, a stable generation of 1 kW will be considered.


Oven and dishwasher

The oven or dishwasher can be ON (working) or OFF, and there can also be a request to switch it on. They have different programs. For instance, the dishwasher has seven different programs; except for the cold program, the rest of the programs have the same consumption limit (2.2 kW), although their duration can differ. For testing, the most commonly used program will be used.

Time preferences (“on-off time periods”)

Different on-off time periods can be defined (up to 5). In this case, the following preferences will be used for testing:

Night. It coincides with cheaper energy price.

Day. It coincides with more expensive energy price.

All day: unique price the whole day

Uncontrolled load

The uncontrolled load can have different values. For testing three values will be considered:

No uncontrolled loads (0 kW)

Uncontrolled load that has no impact on switching on the dishwasher: < 1 kW

Uncontrolled load that has an impact on switching on the dishwasher: 1.5 kW

The power limit

The contracted power limit can vary. For testing purposes, a fixed power limit will be considered: 3.3 kW. This limit is low enough that the dishwasher has to be delayed if the oven is on.

Energy price (Tariffication rates)

Different tariffications can exist. In this case two possible tariffications will be considered:

Uniform value: The price of the energy is uniform all the day.

Variable profile (day/night): The price of the energy is different depending on the hour: From 0:00-7:00 cheaper, from 7:00 to 17:00 more expensive and from 17:00-24:00 cheaper again.
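
As an illustration of the tariffication profiles above, the following minimal Java sketch returns the tariff band that applies at a given hour. Only the hour boundaries come from the text; the class, method and band names are assumptions for illustration.

class TariffProfile {
    enum Band { CHEAP, EXPENSIVE }

    // Variable (day/night) profile, following the text: 0:00-7:00 cheaper,
    // 7:00-17:00 more expensive, 17:00-24:00 cheaper again.
    static Band variableProfile(int hour) {
        if (hour < 0 || hour > 23) throw new IllegalArgumentException("hour must be 0-23");
        return (hour >= 7 && hour < 17) ? Band.EXPENSIVE : Band.CHEAP;
    }

    // Uniform profile: the same band applies the whole day.
    static Band uniformProfile(int hour) {
        return Band.CHEAP;
    }

    public static void main(String[] args) {
        System.out.println("03:00 -> " + variableProfile(3));  // CHEAP
        System.out.println("12:00 -> " + variableProfile(12)); // EXPENSIVE
        System.out.println("19:00 -> " + variableProfile(19)); // CHEAP
    }
}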


Partitioning input domain according to aspects

The input domain of the test object is partitioned into classes according to the aspects identified. For every member of a class, the system shows the same kind of behaviour. This partitioning method is called equivalence partitioning. The classes are disjoint, meaning that every input value is a member of exactly one class. Figure 13 shows the result of the partitioning into classes.

Figure 13: Classification tree

The CTE (Classification-Tree Editor) tool (http://www.berner-mattner.com/en/berner-mattner-home/products/cte-xl/) has been used to apply this technique and create the classification tree.

Specifying logical test cases

The CTE tool can also be used to define or generate the coverage criteria for the test cases (see Figure 14). In this case, test cases have been generated using a pairwise criterion to cover all combinations of two classes.

Figure 14: The Classification tree and test cases
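
To illustrate what the pairwise criterion covers, the sketch below enumerates all pairs of classes belonging to different aspects and reports how many of them a given set of logical test cases covers. The aspect names, class values and test cases are a simplified, hypothetical subset of the classification tree in Figure 13; this is only a coverage check, not the CTE tool or the generator of [5].

import java.util.*;

// Illustrative pairwise-coverage check: which pairs of classes (from different
// aspects) are covered by a given set of logical test cases.
class PairwiseCoverage {
    public static void main(String[] args) {
        // Simplified aspects and classes (hypothetical subset of the tree).
        Map<String, List<String>> aspects = new LinkedHashMap<>();
        aspects.put("Stirling", List.of("generating", "off"));
        aspects.put("Oven", List.of("off", "working", "petition"));
        aspects.put("Dishwasher", List.of("off", "working", "petition"));
        aspects.put("UncontrolledLoad", List.of("0kW", "<1kW", "1.5kW"));

        // Hypothetical logical test cases: one class selected per aspect.
        List<Map<String, String>> testCases = List.of(
                Map.of("Stirling", "generating", "Oven", "working",
                       "Dishwasher", "petition", "UncontrolledLoad", "0kW"),
                Map.of("Stirling", "off", "Oven", "off",
                       "Dishwasher", "working", "UncontrolledLoad", "1.5kW"));

        // Collect the pairs covered by the test cases.
        Set<String> covered = new HashSet<>();
        for (Map<String, String> tc : testCases) {
            List<String> keys = new ArrayList<>(tc.keySet());
            for (int i = 0; i < keys.size(); i++)
                for (int j = i + 1; j < keys.size(); j++)
                    covered.add(pair(keys.get(i), tc.get(keys.get(i)),
                                     keys.get(j), tc.get(keys.get(j))));
        }

        // Enumerate all possible pairs and report coverage.
        List<String> names = new ArrayList<>(aspects.keySet());
        int total = 0, hit = 0;
        for (int i = 0; i < names.size(); i++)
            for (int j = i + 1; j < names.size(); j++)
                for (String a : aspects.get(names.get(i)))
                    for (String b : aspects.get(names.get(j))) {
                        total++;
                        if (covered.contains(pair(names.get(i), a, names.get(j), b))) hit++;
                    }
        System.out.println("Covered " + hit + " of " + total + " class pairs");
    }

    // Canonical, order-independent key for a pair of (aspect, class) selections.
    static String pair(String a1, String c1, String a2, String c2) {
        return a1.compareTo(a2) <= 0 ? a1 + "=" + c1 + "|" + a2 + "=" + c2
                                     : a2 + "=" + c2 + "|" + a1 + "=" + c1;
    }
}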


Test coverage

The tester must define the number of test case specifications, but some reference numbers can be considered:

The minimum criterion: the number of test cases needed to include each leaf class at least once. In the previous example, following this criterion the number of test cases is 3 (all leaf classes appear in at least one test case).

The maximum criterion: the number of test cases that results from every possible combination. In the previous example, following this criterion the number of test cases is 234 (all the combinations).

A reasonable number of test case specifications lies between the minimum and the maximum criterion. As a rule of thumb, the total number of leaf classes gives an estimate for the number of test cases required to get sufficient test coverage [4]. In this case, we have used a pairwise criterion to select the test cases, which, according to the literature, provides a more than acceptable test coverage. Moreover, weights can be assigned to pairs that are considered important in order to introduce these pairs in more test cases. For instance, the following pairs:

The uncontrolled load will not have an impact because the Stirling is on:
o Uncontrolled load 1.5 kW
o Stirling on

The uncontrolled load will have an impact because the Stirling is off:
o Uncontrolled load 1.5 kW
o Stirling off

The home appliance will be switched on in the most expensive period:
o Time preferences: Day
o Energy price: Variable profile (day/night)

The home appliance can be switched on in the cheapest period:
o Time preferences: Night / All day
o Energy price: Variable profile (day/night)

The energy price is the same all the time:
o Time preferences: Day / Night / All day
o Energy price: Uniform value

Not enough energy when switching on a device, the dishwasher is delayed:
o Oven working
o Dishwasher petition

Enough energy when switching on a device:
o Oven off
o Dishwasher petition

Need to switch off the dishwasher (lack of power):
o Dishwasher working
o Oven petition

Need to switch off the dishwasher (lack of power):


o Dishwasher working
o Uncontrolled load 1.5 kW + Stirling off

… From the previous pairs, the most important ones have been prioritized and others eliminated because they are impossible or not interesting to test:

Prioritized pairs:
o Oven working + Dishwasher petition
o Oven petition + Dishwasher working
o Dishwasher petition + Time Preferences: Night
o Uncontrolled load 1.5 kW + Dishwasher petition
o Uncontrolled load 1.5 kW + Dishwasher working

Eliminated pairs:
o Dishwasher no + Time Preferences (all)
o Dishwasher no + Oven no
o Dishwasher working + Oven working
o Dishwasher petition + Oven petition: two simultaneous petitions will generate two calls to the algorithm; they can be tested separately
o Stirling generating + Uncontrolled load 1 kW

Assembling the test script

Following the pair-wise criterion described in the previous section and weighting those pairs that are considered most important, the test cases detailed in the following tables have been obtained.

Table 3 Test cases using the Stirling boiler as an input


Table 4 Test cases using the Stirling boiler as an output

The pairs with the highest priority have been highlighted in the previous tables, and will be considered for the testing phase in Task T7.2. To obtain the test cases with prioritized pairs, another tool has been used [5].

2.3.3 Infrastructure

2.3.3.1 Test Environment

The simulation stage was performed in WP2, WP3 and WP6. Now, in WP7, we are at the prototyping stage, where the following test levels are applicable:

Software unit test: already performed in WP3.
Software integration test: already performed in WP5.
Hardware/software integration test: performed in WP5.
System integration test: to be performed in WP7.
Environmental test: will be performed in WP8.

The embedded system to be tested, as well as the plant or environment, is described in Figure 15.


Figure 15: Embedded system to be tested

In WP7, the purpose is to integrate the system and test it in a demo laboratory. Testing may be required at different levels: SIL (software-in-the-loop), PIL (processor-in-the-loop) and HIL (hardware-in-the-loop). Related to these levels, during testing the processor can be:

PC
Emulator
SPEAr processor

The rest of the embedded system, such as the ZigBee communicator of the CDC, will be a ZigBee dongle (via USB) both in the PC and in the SPEAr processor. The plant or environment can be composed of:

Simulated home appliances
Real home appliances

2.3.3.2 Tools

The test cases have been generated using the tool in [5], which allows pairs to be prioritized.


2.3.3.3 Organization

Who is going to test the energy management system? The following test roles may be involved:

Test engineer: The test engineer's tasks are the most creative, but also the most labor intensive. The role of test engineer is necessary from the preparation phase through to completion. Typical tasks: review of the test basis (specifications), specification of logical and physical test cases and of the start situation, execution of test cases (dynamic testing), static testing, registration of defects, archiving of testware...

Methodology support: The role of methodology support involves rendering assistance concerning methodology in the broadest sense.

Test manager: The test manager is responsible for planning, management, and execution of the test process within time, budget, and quality requirements.

Team leader: A team leader manages a small team (from two to four) of testers. The role is involved from the preparation phase through to completion.

Domain expert: The domain expert renders assistance to the test process concerning functionality in the broadest sense. The role provides support in carrying out the activities where the functionality issues can be important. These activities may include test strategy development or risk reporting by test management, as well as the testability review, or design of test cases by a tester.

Technical support: The role of technical support involves rendering assistance concerning technology in the broadest sense. This role is responsible for making the high-quality test infrastructure continuously available to the test process. This secures the operability of test environments, test tools, and office requirements. The technical support role can be active during the whole test lifecycle.


3. Communications Technologies

3.1 Description of the Facilities

A ZigBee communications demonstrator will be installed at the Electronics, Computer Sciences and Systems (DEIS) building of the University of Bologna. The Cell environment of the eDIANA scenario will be reproduced in a down-scaled way (in terms of distances between devices, transmit power, etc.). By decreasing the transmit power, the range of the ZigBee nodes will be controlled so that a fully functional test-bed fits in a single room. Having such an environment will improve the quality of the demonstration, giving the observers an all-in-one experience by showing the whole interaction in sight.

There will be 20 nodes simulating several devices. 10 of those nodes will have a graphic LCD display for the purpose of simulating the household devices. Since this test site is focused on the demonstration of the communication in the Cell, there won't be any real household/office appliances; instead, ZigBee nodes will simply simulate the typical household appliances in order to demonstrate some reference scenarios described in the "D1.4 Reference Scenarios" deliverable.

In this demonstrator the end point configuration given in "D5.2-A Cell Integration Results" will be used with some minor modifications. In the coordinator there will be two ZigBee devices, namely a Combined Interface and an Energy Service Portal, while in the iEi there will be a Generic Device and a Metering Device. The default clusters coming with these devices are shown in Figure 16. Differently from D5.2-A, the clusters shown in red in Figure 16 will not be used because they are not applicable in this test scenario. Standard security with a non-preconfigured network key and the Home Automation default trust center link key will be used in order to test the security in eDIANA scenarios. For the sake of simplicity, the types of appliances in the demonstration will be limited to three, as given below, in order to share common code between appliances as much as possible. No new cluster will be introduced; already defined ZigBee clusters and attributes will be used to test the communication in the cell.


[Figure content: the ZigBee Coordinator (code size 94.864 bytes, 3440 bytes available for RAM) hosts the Combined Interface (EP 8) and the Energy Service Portal (EP 9); the ZigBee Router / iEi (code size 94.080 bytes, 4224 bytes available for application and RAM) hosts the Generic Device (EP 8) and the Metering Device (EP 9), together with a ZigBee Test Client (ZTC). Each endpoint lists its mandatory (M) and optional (O) server and client clusters, e.g. Basic, Identify, Groups, Scenes, OnOff, LevelControl, KeyEstablishment, SimpleMeter, Message, DemandResponseLoad, Time and Price.]

Figure 16: Endpoint Configuration with Default Clusters (clusters in red are not used)


Heating or Cooling Capable Appliances

Several devices such as fridge, water heater or HVAC can be realized as a generic heating or cooling device by implementing the flowchart shown in Figure 17.

[Figure content: the ZigBee Coordinator sends the desired temperature (DT) and On/Off commands to the thermostat; the device reads the local temperature (LT) from a temperature sensor and checks an occupancy sensor, compares DT with LT, and accordingly drives the cooling unit or the heating unit (with level control and fan control), reporting its instantaneous energy consumption.]

Figure 17: Flow Chart of Heating or Cooling Capable Devices
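
The flowchart can be read as a simple control decision: compare the desired temperature (DT) received from the ZigBee Coordinator with the local temperature (LT) and drive the heating or cooling unit accordingly. The Java sketch below only illustrates that decision under assumed names (ThermostatLogic, Mode) and an assumed handling of the occupancy sensor; it is not the firmware running on the demonstrator nodes.

class ThermostatLogic {
    enum Mode { HEATING_ON, COOLING_ON, IDLE }

    // Decide whether to heat or cool, based on the desired temperature (DT)
    // and the local temperature (LT). Occupancy handling is an assumption.
    static Mode decide(boolean deviceOn, boolean occupied, double desiredTemp, double localTemp) {
        if (!deviceOn || !occupied) {
            return Mode.IDLE;           // device switched off or room unoccupied
        }
        if (desiredTemp < localTemp) {
            return Mode.COOLING_ON;     // DT < LT: room warmer than desired
        } else if (desiredTemp > localTemp) {
            return Mode.HEATING_ON;     // DT > LT: room colder than desired
        }
        return Mode.IDLE;               // already at the desired temperature
    }

    public static void main(String[] args) {
        System.out.println(decide(true, true, 21.0, 24.5));  // COOLING_ON
        System.out.println(decide(true, true, 21.0, 18.0));  // HEATING_ON
        System.out.println(decide(true, false, 21.0, 18.0)); // IDLE (unoccupied)
    }
}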


Programmable Appliances

Programmable devices like washing machines, dishwashers, etc. can in general be realized as given in Figure 18.

[Figure content: the ZigBee Coordinator exchanges energy price and messages with the Programmable Appliance and receives its consumption profile; the user chooses the programme and decides to start now or let the CDC decide; the appliance reports its instantaneous energy consumption and receives Start/Stop commands.]

Figure 18: Flow Chart of Programmable Devices

On/Off Devices (Ex: On/Off Light)

All other simple dummy devices can be realized as represented in Figure 19.

[Figure content: the ZigBee Coordinator sends On/Off commands to the On/Off Device and receives its instantaneous energy consumption.]

Figure 19: Flow Chart of On/Off Devices


3.2 Test Methodology

In this part the communication platform is considered as a whole system, independent from the rest of the eDIANA architecture. This kind of scoping was necessary since the rest of the system sees the communication platform as a black box that is responsible for the communication in the Cell. The ZigBee stack is distributed software that runs on the iEis in order to realize mechanisms such as self-healing, routing and channel access defined in IEEE 802.15.4/ZigBee; therefore the communication platform cannot easily be divided into units/components that can be tested individually. The communication platform can be seen as the SUT, including the ZigBee nodes that work together to provide seamless communication inside the Cell. The tests in this part are therefore all within the scope of system testing; in particular, they are field tests. Integration and unit/component tests are not applicable at this scale because the ZigBee stack is provided as object code by the transceiver manufacturer. The robustness, functionality and reliability of the stack are the responsibility of the software provider, thus functional tests are not easy to perform. On the other hand, non-functional tests can be applied in order to observe the performance. Finally, structural tests are not applicable at this scale because the internal structure of the ZigBee stack implementation is not publicly available.

3.2.1 Non-Functional Tests

In order to reveal the non-functional behaviour of the communication platform, load and stress, security and resource tests can be performed.

3.2.1.1 Load and Stress Tests

In different traffic conditions the response of the system to the traffic load can be measured to see the real-life behaviour of the ZigBee platform.

A) Latency

Latency is an important metric that defines the responsiveness of the communication platform. The processing capabilities of the relaying devices, the communication protocol and the modulation in the physical layer influence the latency. The application may suffer from excessive delays introduced by the network. Latency should stay within tolerable limits in order to have seamless operation.


Testing Activities:

Formulating the assignment The objective of the testing process is to measure the latency in different topologies and network utilization conditions in the scope of the eDIANA Cell level.

Global review and study On the Cell level of eDIANA, IEEE 802.15.4/ZigBee compatible transceivers are used for the communication. IEEE 802.15.4: a standard defined for low-cost and low-power operation in personal area networks. It defines the PHY and MAC layers; 2.4 GHz ISM band channels use O-QPSK modulation and channel access is based on CSMA-CA (http://www.ieee802.org/15/pub/TG4.html). ZigBee: the ZigBee specifications define the network and application layers on top of the IEEE 802.15.4 layers (http://www.zigbee.org/). The Transceiver: the Freescale MC1322x family is Freescale's third-generation ZigBee platform, which incorporates a complete, low power, 2.4 GHz radio frequency transceiver, a 32-bit ARM7 core based MCU, hardware acceleration for both the IEEE 802.15.4 MAC and AES security, and a full set of MCU peripherals into a 99-pin LGA Platform-in-Package (PiP).

Establishing the test basis Relevant eDIANA Deliverables: D2.3-A Network topology and communications architecture definition D2.3-B Communication protocol specification D3.2-C Intelligent Embedded Interface (iEi).

Determining the test strategy Latency can be measured in different topologies and in different network utilization conditions.

Setting up the organization Personnel responsible for the deployment and maintenance of the network.

Specifying test deliverables The average latency in different conditions.

Specifying the infrastructure A ZigBee test-bed can be deployed in an office environment in order to measure the latency. Necessary equipment: ZigBee nodes, batteries, flash programmer. There could be fixed reference nodes and mobile nodes in order to obtain different topologies. Controlled network traffic can be generated by the nodes, which results in different network utilization.


Organizing management and control

Infrastructure would be deployed in an office/home environment and the measurements would be stored by the CDC.

Scheduling the test process The network should be active until statistically significant data is collected.

Controlling the test Software/hardware problems that might occur while the network is running must be checked periodically in order to obtain reliable data.

Reporting Comparative charts that summarize the results.

Establishing detailed schedules

Programming the nodes, deployment, collecting the data.
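
As a sketch of how the collected measurements could be reduced to the specified deliverable (average latency under different conditions), the Java fragment below computes the average and maximum latency from a list of send/receive timestamps. The assumed data layout (millisecond timestamps logged per packet) is an illustration only and is not tied to any specific logging format of the test-bed.

import java.util.List;

// Illustrative reduction of collected latency samples (assumed to be logged as
// send/receive timestamps in milliseconds) into average and maximum latency.
class LatencyStats {
    record Sample(long sendMillis, long receiveMillis) {
        long latencyMillis() { return receiveMillis - sendMillis; }
    }

    static void report(List<Sample> samples) {
        long max = 0, sum = 0;
        for (Sample s : samples) {
            long latency = s.latencyMillis();
            sum += latency;
            max = Math.max(max, latency);
        }
        double avg = samples.isEmpty() ? 0.0 : (double) sum / samples.size();
        System.out.printf("samples=%d avg=%.1f ms max=%d ms%n", samples.size(), avg, max);
    }

    public static void main(String[] args) {
        report(List.of(new Sample(0, 18), new Sample(100, 131), new Sample(200, 224)));
    }
}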

B) Packet Error Rate

The packet error rate (PER) is the number of incorrectly transferred data packets divided by the number of transferred packets. A large PER may cause malfunctions in the network.

Testing Activities:

Formulating the assignment The objective of the testing process is to measure the PER in different topologies and network utilization conditions in the scope of the eDIANA Cell level.

Global review and study On the Cell level of eDIANA, IEEE 802.15.4/ZigBee compatible transceivers are used for the communication. IEEE 802.15.4: a standard defined for low-cost and low-power operation in personal area networks. It defines the PHY and MAC layers; 2.4 GHz ISM band channels use O-QPSK modulation and channel access is based on CSMA-CA (http://www.ieee802.org/15/pub/TG4.html). ZigBee: the ZigBee specifications define the network and application layers on top of the IEEE 802.15.4 layers (http://www.zigbee.org/). The Transceiver: the Freescale MC1322x family is Freescale's third-generation ZigBee platform, which incorporates a complete, low power, 2.4 GHz radio frequency transceiver, a 32-bit ARM7 core based MCU, hardware acceleration for both the IEEE 802.15.4 MAC and AES security, and a full set of MCU peripherals into a 99-pin LGA Platform-in-Package (PiP).


Establishing the test basis Relevant eDIANA Deliverables: D2.3-A Network topology and communications architecture definition D2.3-B Communication protocol specification D3.2-C Intelligent Embedded Interface (iEi)

Determining the test strategy PER can be measured in different topologies and in different network utilization conditions

Setting up the organization Personnel responsible for the deployment and maintenance of the network

Specifying test deliverables PER in different conditions

Specifying the infrastructure A ZigBee test-bed can be deployed in an office environment in order to measure the PER. Necessary equipment: ZigBee nodes, batteries, flash programmer. There could be fixed reference nodes and mobile nodes in order to obtain different topologies. Controlled network traffic can be generated by the nodes, which results in different network utilization.

Organizing management and control

Infrastructure can be deployed in an office/home environment and the measurements can be stored by the CDC.

Scheduling the test process The network should be active until statistically significant data is collected.

Controlling the test Software/hardware problems that may occur while the network is running should be checked periodically in order to obtain reliable data.

Reporting Comparative charts that summarize the results.

Establishing detailed schedules

Programming the nodes, deployment, collecting the data.
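
The PER itself reduces to the ratio defined above (incorrectly transferred packets divided by transferred packets). A minimal sketch, assuming the receiver simply counts how many of the transmitted packets were received correctly:

// Illustrative PER computation: packets lost or received with errors, divided
// by the number of packets transmitted. Counter names are assumptions.
class PacketErrorRate {
    static double per(long transmitted, long receivedCorrectly) {
        if (transmitted == 0) throw new IllegalArgumentException("no packets transmitted");
        return (double) (transmitted - receivedCorrectly) / transmitted;
    }

    public static void main(String[] args) {
        // 1000 packets sent, 987 received correctly -> PER = 1.30%
        System.out.printf("PER = %.2f%%%n", 100.0 * per(1000, 987));
    }
}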

3.2.1.2 Security Tests

A) ZigBee Security

The wireless medium is open to all, including potential intruders; therefore the ZigBee network should be protected with adequate security to prevent unauthorized devices from joining the network.

Testing Activities:

Formulating the assignment The objective of this test is to verify the functionality of the authorization mechanism at the Cell level.

Global review and study ZigBee Standard Security level 5 and Home Automation Profile compatibility were chosen as the security configuration for the eDIANA Cell level in "D2.3-B Communication protocol specification". Security suite 5, with the proper deployment of security keys, ensures that the information exchanged between the ZigBee nodes cannot be stolen (message encryption) or tampered with (message integrity check). However, when a new device is waiting to join the network, a temporarily vulnerable moment occurs because the network key is sent using a publicly known link key (the Home Automation profile default trust center link key) for the authorization of this new device. "D2.3-B Communication protocol specification" also defines a second level of authorization, based on E-Mode commissioning, after the standard authorization procedure defined in ZigBee security. Details can be found in Part 2.6.2 of that document.

Establishing the test basis Relevant eDIANA Deliverables:

D2.3-A Network topology and communications architecture definition D2.3-B Communication protocol specification

Determining the test strategy The CDC should be able to differentiate, in a given set of intruder and authorized devices, the devices that are going to join the network. ZigBee routers and end devices should be tested in order to see that the authorization works correctly independently of the device type. Also, devices that are more than one hop away from the trust centre (ZigBee Coordinator) should be tested.

Setting up the organization Personnel responsible for the deployment and maintenance of the network

Specifying test deliverables Sniffer traces showing the message exchanges during the authorization procedure.

Specifying the infrastructure A ZigBee test-bed can be deployed in an office environment. Necessary equipment: ZigBee nodes, batteries, flash programmer, a sniffer.

Scheduling the test process The network should be active until a reliable number of instances has been collected.

Controlling the test The packet exchange should be traced using a sniffer, and a log of the tests should be kept, in order to properly analyze the authorization procedure.

Reporting Sniffer traces and description of different configurations.

Establishing detailed schedules Programming the nodes, deployment, collecting the sniffer traces.


3.2.1.3 Resources Tests

In order to give insights into the transmit power required to have a reliable link between two ZigBee nodes, the receiver sensitivity of the ZigBee transceiver and the path loss in a typical office environment can be measured.

A) Receiver Sensitivity Tests

Receiver sensitivity indicates the minimum strength of the signal that can be successfully received by the receiver.

Testing Activities:

Formulating the assignment The objective of the testing process is to obtain the relationship between PER and the received signal strength.

Global review and study On the Cell level of eDIANA IEEE 802.15.4 compatible transceivers are used for the communication. IEEE 802.15.4 Standard defines receiver sensitivity as the threshold input signal power that yields a specified PER in the conditions that the packet length is 20 bytes, PER<1%, power measured at antenna terminals, and interference not present. The standard requires -85 dBm receiver sensitivity in 2.4 GHz ISM band.

Establishing the test basis In the Freescale MC1322X Data Sheet the receiver sensitivity is given as better than -96 dBm in normal operation mode. On every packet reception, the receiver block in the transceiver reports a Link Quality Indication (LQI), which is calculated from several reported hardware levels. This value can be converted to the input power by using the following formula, as reported in the MC1322X Reference Manual: Input Power (dBm) = (LQI / 3) - 100. Relevant Documents:

IEEE 802.15.4-2006 Standard MC1322X Reference Manual MC1322X Data Sheet
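
Using the conversion above, reported LQI values can be turned into estimated input power and checked against the -85 dBm sensitivity required by the standard and the better-than -96 dBm figure from the MC1322X data sheet. In the sketch below only the formula and those two thresholds come from the text; the class name and the sample LQI values are assumptions.

// Illustrative conversion of the reported LQI to input power, using the
// MC1322X Reference Manual formula: Input Power (dBm) = (LQI / 3) - 100.
class ReceiverSensitivityCheck {
    static double lqiToInputPowerDbm(int lqi) {
        return (lqi / 3.0) - 100.0;
    }

    public static void main(String[] args) {
        double standardSensitivityDbm = -85.0;  // IEEE 802.15.4 requirement (2.4 GHz)
        double dataSheetSensitivityDbm = -96.0; // MC1322X data sheet (normal mode)
        int[] sampleLqiValues = {30, 60, 120};  // example values, not measurements
        for (int lqi : sampleLqiValues) {
            double dbm = lqiToInputPowerDbm(lqi);
            System.out.printf("LQI=%d -> %.1f dBm (above std. sensitivity: %b, above data sheet: %b)%n",
                    lqi, dbm, dbm >= standardSensitivityDbm, dbm >= dataSheetSensitivityDbm);
        }
    }
}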

Determining the test strategy In a controlled environment where there is no significant interferer, PER and LQI can be measured at different distances between receiver and transmitter while the transmitter is emitting packets at a fixed transmit power. The distance between the devices changes the input power at the receiver.

Setting up the organization Personnel responsible for the deployment and maintenance of the network


Specifying test deliverables A chart that shows the relationship between PER and LQI which highlights the receiver sensitivity.

Specifying the infrastructure A receiver and a transmitter device can be located in an environment where there is no significant interferer in the 2.4 GHz ISM band. The transmitter can send a statistically significant number of packets to the receiver; the receiver can then calculate the PER and the average LQI in order to obtain the relationship between them.

Organizing management and control

The transceivers can be deployed in a public garden far away from the wi-fi access points working on 2.4GHz ISM band.

Scheduling the test process The measurements should last until statistically significant data is collected.

Controlling the test Software/hardware problems that may occur during the measurements should be checked periodically in order to obtain reliable data. In order to have meaningful data the LQI value shouldn't oscillate significantly during the measurements, so the LQI on the receiver side should be monitored during the measurement. People walking around may cause LQI oscillations at the receiver, so the environment around the transceivers should be kept still; there shouldn't be moving objects.

Reporting A chart that shows the relationship between PER and LQI which highlights the receiver sensitivity. A discussion on the results

Establishing detailed schedules Programming the nodes, deployment, collecting the data

B) Path Loss Tests

Path loss is the attenuation of an electromagnetic wave as it propagates through space. The attenuation is different in different media, so it could be interesting to measure the path loss in a typical office environment and to compare these values with the chart that will come from the receiver sensitivity tests.

Testing Activities:

Formulating the assignment The objective of the testing process is to collect the instances of the path loss in an office environment in order to gain some insights on the connectivity between ZigBee nodes.

Global review and study Path loss can be modelled simply as L = 10·n·log10(d) + C, where L is the path loss in decibels, n is the path loss exponent, d is the distance between the transmitter and the receiver, and C is a constant that accounts for the system losses and the obstacles between the transceivers.
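
The log-distance model above can also be used the other way round, to estimate the received power at a given distance and check whether it stays above the receiver sensitivity. In the sketch below the transmit power, the path loss exponent n and the constant C are example placeholders, not measured results.

// Illustrative use of the log-distance path loss model L = 10*n*log10(d) + C.
// The transmit power, n and C are example placeholders, not measured values.
class PathLossModel {
    static double pathLossDb(double n, double distanceMeters, double c) {
        return 10.0 * n * Math.log10(distanceMeters) + c;
    }

    public static void main(String[] args) {
        double txPowerDbm = 0.0;       // example transmit power
        double n = 2.5;                // example path loss exponent for an office
        double c = 40.0;               // example constant (system losses, obstacles)
        double sensitivityDbm = -96.0; // MC1322X data-sheet receiver sensitivity
        for (double d : new double[]{1, 5, 10, 20}) {
            double loss = pathLossDb(n, d, c);
            double rxPowerDbm = txPowerDbm - loss;
            System.out.printf("d=%.0f m -> L=%.1f dB, Prx=%.1f dBm, link ok: %b%n",
                    d, loss, rxPowerDbm, rxPowerDbm >= sensitivityDbm);
        }
    }
}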

Establishing the test basis On every packet reception, the receiver block in the transceiver reports a Link Quality Indication (LQI), which is calculated from several reported hardware levels. This value can be converted to the input power by using the following formula, as reported in the MC1322X Reference Manual: Input Power (dBm) = (LQI / 3) - 100. Relevant Documents:

IEEE 802.15.4-2006 Standard MC1322X Reference Manual MC1322X Data Sheet

Determining the test strategy In an office environment the LQI can be measured while the transceivers are in different locations. The distance between the devices and the obstacles between them change the input power at the receiver because of the different path loss.

Setting up the organization Personnel responsible for the deployment and maintenance of the network

Specifying test deliverables Charts showing the LQI values on the receiver side while the transmitters are in different positions.

Specifying the infrastructure A receiver and a transmitter device can be located in an office environment. The transmitter can send a statistically significant number of packets to the receiver; the receiver can then calculate the average LQI.

Organizing management and control

The transceivers can be deployed in a public garden far away from the wi-fi access points working on 2.4GHz ISM band.

Scheduling the test process The measurements should last until statistically significant data is collected.

Controlling the test Software/hardware problems that may occur during the measurements should be checked periodically in order to obtain reliable data.

Reporting Charts showing the LQI values on the receiver side while the transmitters are in different positions. A discussion on the results

Establishing detailed schedules Programming the nodes, deployment, collecting the data


4. Graphical User Interface and User Accessibility Tests

The eDIANA GUI is the gateway through which the user receives feedback from the system regarding energy consumption. In eDIANA, both a mobile and a television UI are developed and integrated. These demo UIs have to be tested with real users to see how eDIANA can persuade the user to reduce his energy consumption. The main objective of this user test is to identify how eDIANA can have an impact on the user's behaviour regarding his energy management. Furthermore, there is an interest in identifying the user's impressions of the eDIANA system. The test results will be considered within the context of the WP8 demo.

4.1 GUI Test Goal The intention is to test the GUI on three different domains:

1. Navigation through the GUI (accessibility) The main purpose is to find out whether the user is able to access all the different information that the eDIANA web service can provide. Is the user able to generate a graph or to access specific advice? Specific tasks will be assigned to the user to check whether he is able to perform them successfully.

2. Understanding of the information (understanding)

It is important to validate the user's understanding of the information displayed on the eDIANA web site. Examples of subjects are the interpretation of the graphs, sections, speed meter, etcetera.

3. User acceptance (satisfaction)

The third domain is rather abstract. This domain aims to identify the user's attitude towards the eDIANA GUI. Will the user make use of the system regularly? What are the pros and the cons of the GUI?

4.2 GUI Test Methodology

Two similar qualitative user tests are considered to find meaningful insights regarding the three described domains: one for TV and one for smart phone. The participants will be interviewed in three different steps, each related to a specific domain. First, the user can explore the GUI by freely navigating through it, and the interviewer will ask specific questions regarding the navigation. After that the user is asked to perform certain tasks, and the interviewer will ask specific questions regarding understanding. Finally, the participant will be asked to complete a small questionnaire about the eDIANA GUI acceptance.


4.2.1 Participants

12 participants will be confronted with the eDIANA GUI, 6 for the internet-connected television and 6 for the smart phone. For each experiment, a variety of participants will be considered. The aim is to recruit participants aged between 20 and 50, both male and female. These profiles can be considered as the eDIANA target group. For both tests, participants will be screened to find out whether they already have experience with the considered devices. Only participants with Net TV or smart phone experience are allowed to participate; otherwise the test would rather focus on the device than on the application itself.

4.2.2 Informed consent

All the participants have to sign a consent form. By signing, the participant indicates that he is aware of the test methodology and test goals. Their data has to remain confidential and will not be used outside of the eDIANA project. The consent form is available in Annex I.

4.2.3 Test facilities and materials

Two different locations are considered for the user test: one for the TV GUI and one for the mobile GUI. The TV GUI will be tested in the Philips Consumer Lifestyle Living Room Labs, Netherlands. These labs are fully equipped living rooms with a state-of-the-art TV installed. This will give the participant a realistic in-home experience. The TV, equipped with Net TV, will include the eDIANA web service. The participants will sit between 2 and 3 meters away from the TV screen, which is an average viewing distance. The mobile GUI will be tested in a complete office environment within ACCIONA's R&D department building in Spain. Different users, covering different profiles, will be provided with a smartphone with the eDIANA mobile UI for the experimental phase. These smart phones will run different operating systems and will have access to the interface through 3G and Wi-Fi. Other materials that will be used for this test:

Observation forms to aid the evaluator during the study. This form includes the whole test procedure and space for notes is available. One form is needed per participant.

One consent form per participant (see section 4.2.2).


4.2.4 Measurements

The test will be approached mainly in a qualitative way, since the main goal of the test is gathering information regarding the participants' attitude towards the UI. A qualitative test is also more suitable since there is no comparison against another UI. During the whole test, the user will be asked to "think aloud". Thinking aloud is a specific method for user testing: the user always tells the researcher what he is doing/thinking during the test, e.g. "I am about to press the middle button. Where should it bring me?" The researcher will only take notes. Audio and video recording will not be considered, since this can give the participant an uncomfortable feeling. Also, for the TV UI test, only two persons are allowed in the testing facility: the researcher and the participant. For the smartphone test, other persons might be around, since this reflects a real scenario more closely. However, these persons may not interrupt the participant or the researcher during the test.

4.2.5 Analysis

The interview will result in the following data set:

Notes, taken by the researcher during the whole test

Questionnaires, completed by the participants.

Conclusions, made by the researcher after the evaluation

Obviously, the gathered qualitative data will be the most important for the researcher when evaluating the navigation and understanding domains. For the satisfaction test, the participant's rating will be considered.

4.2.6 Results

The evaluation of the UI will result in one report for each display device, summarizing the main findings within the context of the three test domains. The report will mainly include the following:

Conclusions regarding accessibility

o Problem areas regarding the navigation

o Impression of the general acceptance of navigating.

o Suggestions for improvement

o Discussion regarding the findings


Conclusions regarding understanding

o Problem areas regarding the navigation

o Well understood versus less understood sections

o Suggestions for improvement

o Discussion regarding the findings

Conclusions regarding satisfaction

o Problem areas regarding the navigation

o Impression of the general acceptance of navigating.

o Suggestions for improvement

o Discussion regarding the findings

The different reports and findings will be included in the T7.2 deliverable. If there is time and budget within this task, minor modifications on the UI will be made, based on the test findings.

4.3 Test Scenario

For both tests, a similar test scenario is considered. However, there will be some essential differences regarding the questions that the interviewer will ask; these are device-specific questions. E.g. for Net TV some questions might be asked regarding the remote control, while the smart phone test might include questions on touch control. The test scenario is divided into three different sections: pre-test, actual test and debriefing.

4.3.1 Pre-test

First, the participant will be welcomed and introduced to the eDIANA project (before that, the participant is screened about his Net TV experience). More particularly, the GUI and its objectives will be explained in more detail.


Furthermore, the test procedure, objectives and duration will be explained to the participant. After informing the participant, he will be asked to sign the consent form (see attachment A). Once this procedure is finished, the actual test can start.

4.3.2 Actual test

In the actual test, the three domains should always be considered in the same order. It is essential to consider accessibility first, since otherwise the user might be biased by navigating through the UI during the other test domains. After that, information understanding is reviewed. Last, user satisfaction is considered.

4.3.2.1 Accessibility

At the very beginning the user sees the home screen of eDIANA. This is where the actual review will start. To have an impression of how the user navigates through the icons, he will be asked to perform several specific tasks. There is a specific interest in whether the user is able to:

Navigate through the interface by using the RC (task 1A - Only for the TV UI)

Navigate through the interface by using touch screen (task 1B - Only for the smartphone UI)

Generate a graph (task 2)

Check the consumption of a specific device that is not visible at first in the device section (task 3)


Task 1a: Take the remote control and explore the UI freely.

Start: Home screen

Goal: explore the UI. Test whether the user is able to navigate through the application

in an intuitive way.

Notes

Keep on talking, please.

What are you looking at right now?

Was that easy to do?

Can you explain the structure of the UI?

Please explore all the sections

What do you think that button does?

Do you think it was rather difficult or rather easy? Why?

Were you able to complete the task? Yes / No


Task 1b: Take the smart phone, enter URL in web browser and explore the UI freely.

Start: Home screen

Goal: explore the UI. Test whether the user is able to navigate through the application

in an intuitive way.

Notes

Keep on talking, please.

What are you looking at right now?

Was that easy to do?

Can you explain the structure of the UI?

Please explore all sections

What do you think that button does?

Do you think it was rather difficult or rather easy? Why?

Were you able to complete the task? Yes / No


Task 2: generate a graph to display your energy consumption from XXX until XXX (a time frame will be defined, just before the test to make it more realistic).

Start: graph section

Goal: Identify how the user generates a specific graph manually.

Notes

Keep on talking, please.

What are you looking at right now?

Was that easy to do?

Do you see what you have to do?

Do you see what you need?

What do you think that button does?

Do you think it was rather difficult or rather easy? Why?

Were you able to complete the task? Yes / No


Task 3: Check the current energy consumption of your dishwasher.

Start: devices section

Goal: Overview of energy consuming devices

Notes

Keep on talking, please.

What are you looking at right now?

Was that easy to do?

Do you see what you have to do?

Do you see what you need?

What do you think that button does?

Do you think it was rather difficult or rather easy? Why?

Were you able to complete the task? Yes / No

4.3.2.2 Understanding

It is essential to know how the eDIANA user interprets the displayed information. We are wondering how users perceive the eDIANA energy management information. For this reason, users will be asked to interact with the different sections and have to tell what they see. There is a specific interest in next sections.

Understanding of the "speed meter" (task 4)

Understanding of the total and average monthly use (task 5)

Understanding of the agenda section (task 6)

Understanding the device section (task 7)


Task 4: check the current total energy consumption.

Start: Home screen

Goal: being able to tell what the current energy consumption is and to have an idea

about the meaning of that value (Euro and Watt).

Notes

Keep on talking, please.

What are you looking at right now?

Can you explain the structure of the UI?

Can you also display the consumption in another value?

What do you think n Watt means?

Do you think it was rather difficult or rather easy? Why?

Were you able to complete the task? Yes / No


Task 5: check the monthly use so far and the average consumption this month.

Start: Home screen

Goal: being able to tell what the total and average energy consumption is and to have

an idea about the meaning of it.

Notes

Keep on talking, please.

What are you looking at right now?

Can you explain the structure of the UI?

What do you think about the average energy consumption when you see that value.

What do you think n Watt means?

Do you think it was rather difficult or rather easy? Why?

Were you able to complete the task? Yes / No


Task 6: Browse the agenda section and tell what the total and average energy consumption is, and what it means.

Start: Agenda section

Goal: being able to tell what the total and average energy consumption is and to have

an idea about the meaning of it.

Notes

Keep on talking, please.

What are you looking at right now?

Can you explain the structure of the UI?

What do you think about the average energy consumption when you see that value.

What do you think n Watt means?

Do you think it was rather difficult or rather easy? Why?

Were you able to complete the task? Yes / No


Task 7: Go to your dishwasher and tell about the status of it.

Start: device section

Goal: being able to understand the feedback regarding the energy consumption of a

specific device

Notes

Keep on talking, please.

What are you looking at right now?

Can you explain the structure of the UI?

What do you see more?

Do you think it was rather difficult or rather easy? Why?

Were you able to complete the task? Yes / No

4.3.2.3 Satisfaction

To measure the satisfaction, another approach is considered. Instead of giving a specific task the user will be asked to complete a brief questionnaire. Before that, the participant is given some more time to explore the UI if he wants. The questionnaire also includes more specific questions, related to the debriefing. The questionnaire can be found in Appendix B.

4.3.3 Debriefing

The debriefing considers the third test domain (satisfaction). The user is given a small questionnaire on paper and is asked to write down his impression regarding the user interface. This post-test questionnaire should gather insights about the users' experience with the interface, along with their preferences for any alterations. A standard debriefing questionnaire is given in Annex B.


5. Energy Efficiency Optimization (with information from the IP camera)

This section refers to the energy efficiency optimization module; however, as it has not been specified in any other work package or task, this section will differ from the previous ones. It describes the functionalities, design and implementation, as well as the test methodology that will be applied to verify and validate this energy efficiency optimization module. The energy efficiency optimization module will take advantage of the information that is stored in the CDC database, information provided by the cameras and the occupancy sensors. The next sections explain the considerations taken while developing this module, the functionalities that will be implemented, the design, and the kind of tests that will be applied to verify and validate the module.

5.1 Overview - Considerations about the Energy Efficiency Optimization

The energy efficiency module will be designed to be as independent as possible. It will act depending on the information that the devices and other modules have stored in the CDC Database. This decision has been taken so that its interaction with the rest of the modules is reduced and the existence of other devices does not affect its functionality. The CDC provides several Java classes that define its behaviour, as Figure 17 shows. There are modules or classes that interact with the database, some modules that, depending on the information in the database, estimate whether the load-balancing algorithm must be executed, others that check the validity of the stored data, etc.


Figure 17: CDC Modules

Some modules are called periodically; the energy efficiency optimization module will be designed to be called in the same way. It will be executed periodically after the other algorithms. The algorithm must support inconsistencies in the data stored in the Database regarding data types, maximum and minimum values, etc. It will not be aware of the frequency with which the data is registered, and it will rely on the timestamps that the registers have. The algorithm must be tolerant to Database breakdowns or to the absence of information. The absence of information will degrade the efficiency of its functionality, but will not cause a system breakdown. This module relies on the Database information, but the identification of the devices in the database is not the same as the one obtained through the ZigBee communications. The correspondence between these identifications is made in the Scheduler, thus a new functionality has been added to that module: the Scheduler must be able to provide an identification match between Database identifiers and ZigBee communication identifiers. The definition of this algorithm needs some small modifications of the CDC Database: as it relies on the information provided by the IP cameras, the algorithm must be able to identify the devices that are in the same area as the IP camera, in order to act on them. Some fields that identify the zones or areas have been added to the following tables:


TCell

o Field: NumZones

o Type: Int

o Description: number of zones of the cell, each zone must have at least one ip camera or an occupancy sensor.

MyDevices

o Field: idZone

o Type: TINYINT

o Description: identifier of the zone in which the device is located

IpCamera

o Field: idZone

o Type: TINYINT

o Description: identifier of the zone that the ipcamera controls

OccupancySensor

o Field: idZone

o Type: TINYINT

o Description: identifier of the zone that the occupancy sensor controls

Actuation on devices will be performed through the same mechanism that the Scheduler uses to control the devices when it acts as a result of the load-balancing algorithm. This is why the correlation between communication identifiers and Database identifiers will be provided by the Scheduler.
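
A minimal sketch of the identifier correspondence that the Scheduler is expected to provide (Database device identifier to ZigBee-side identifier and back). The class and method names, and the example identifiers, are assumptions for illustration and do not represent the actual Scheduler interface.

import java.util.HashMap;
import java.util.Map;

// Illustrative mapping between CDC Database device identifiers and the
// identifiers used on the ZigBee side. Names are assumptions, not the real
// Scheduler API.
class DeviceIdMapping {
    private final Map<Integer, String> dbToZigBee = new HashMap<>();
    private final Map<String, Integer> zigBeeToDb = new HashMap<>();

    void register(int dbDeviceId, String zigBeeAddress) {
        dbToZigBee.put(dbDeviceId, zigBeeAddress);
        zigBeeToDb.put(zigBeeAddress, dbDeviceId);
    }

    String zigBeeAddressFor(int dbDeviceId) { return dbToZigBee.get(dbDeviceId); }

    Integer dbIdFor(String zigBeeAddress) { return zigBeeToDb.get(zigBeeAddress); }

    public static void main(String[] args) {
        DeviceIdMapping mapping = new DeviceIdMapping();
        mapping.register(12, "0x796F");                   // example identifiers
        System.out.println(mapping.zigBeeAddressFor(12)); // 0x796F
        System.out.println(mapping.dbIdFor("0x796F"));    // 12
    }
}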

5.1.1 Functionalities of Energy Efficiency Optimization

The goal of this algorithm is to save extra energy by using the information in the CDC to determine whether any person is currently in the Cell, and by acting on non-programmable loads that can be cut.


The algorithm will use the information from:

IP Cameras (person count and movement detection)

Presence Sensors

Dummy Loads

The first design of the algorithm considered the number of persons present in each zone of the cell. However, since the algorithm runs with a period of minutes and the number of persons is recorded only once a day, this information cannot be used to establish whether a person is present at the time the algorithm is executed. In the following, it is explained as if the person-count timestamp included the exact time and not only the date. Since the algorithm is defined to work correctly in case of lack of information, in real execution only four states of the truth table can actually occur. As a result of the algorithm execution, the CDC will determine whether any person is in the Cell and whether any strange situation is detected:

The person count in the Cell is > 0, but no presence is detected, or

Person count is 0, but some presence is detected in the Cell.

This algorithm introduces a new identifier (zone or group identifier) available to every device, sensor and IP camera in the CDC database. This identifier indicates the zone that each IP camera manages, as well as the zone to which every device belongs. The CDC cell table includes a new attribute that specifies the number of zones of the cell. Per Cell zone (if applicable), the algorithm will calculate the following variables:

Movement (boolean). Whether movement has been detected by any IP camera or presence sensor in the Cell zone.

Person count (int). The total number of people counted in the zone by the associated IP camera. If this value is not available, or if there is an error in the count, -1 is returned.

Load change in the zone in the past hour (boolean). The consumption of all the non-programmable loads in the Cell is checked for the past hour. A true value is returned if a significant load change (> THRESHOLD) is detected in the algorithm period, false otherwise.
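A minimal sketch of the load-change check, assuming the consumptions of the previous iteration are kept in memory and that THRESHOLD is a configurable value in kW (both assumptions, not the released implementation):

import java.util.HashMap;
import java.util.Map;

// Illustrative only: detects a significant load change in a zone by comparing the
// consumption of each non-programmable device with the value stored in the
// previous algorithm iteration. Threshold value and data handling are assumptions.
public class LoadChangeDetector {

    private static final double THRESHOLD_KW = 0.1; // assumed threshold

    // deviceId -> consumption (kW) stored at the previous iteration
    private final Map<Integer, Double> lastConsumptions = new HashMap<>();

    /** Returns true if any device in the zone changed its load by more than the threshold. */
    public boolean isLoadChange(Map<Integer, Double> currentConsumptions) {
        boolean changed = false;
        for (Map.Entry<Integer, Double> e : currentConsumptions.entrySet()) {
            double previous = lastConsumptions.getOrDefault(e.getKey(), 0.0);
            if (Math.abs(e.getValue() - previous) > THRESHOLD_KW) {
                changed = true;
            }
        }
        // Store the current values for the next iteration.
        lastConsumptions.clear();
        lastConsumptions.putAll(currentConsumptions);
        return changed;
    }
}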


Action (int) and Error (enum) will be calculated according to the following truth table.

Movement Load Change Person Count Action Error

0 0 0 Off No

0 0 1 Delayed_Off No

0 0 -1 Delayed_Off Count_Error

0 1 0 Off Unexpected_Load

0 1 1 Enable No

0 1 -1 Enable Count_Error

1 0 0 Delayed_Off Unexpected_Movement

1 0 1 Enable No

1 0 -1 Enable Count_Error

1 1 0 Delayed_Off Count_Error

1 1 1 Enable No

1 1 -1 Enable Count_Error

*Enable action implies:

Leave all devices unchanged.

Enable plugs.

*Delayed_Off action implies:

The algorithm will wait x time in order to see whether the status has changed; if not, it turns off the non-programmable loads.

*Off action implies:

The algorithm will immediately turn off all non-programmable devices.
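To make the truth table concrete, the following sketch maps the three inputs to the Action and Error outputs; the enum and method names are illustrative and not taken from the released code:

// Illustrative evaluation of the truth table above. A person count of -1 means
// the counter is unavailable or in error.
public class TruthTable {

    public enum Action { OFF, DELAYED_OFF, ENABLE }
    public enum Error { NO, COUNT_ERROR, UNEXPECTED_LOAD, UNEXPECTED_MOVEMENT }

    public static Action action(boolean movement, boolean loadChange, int personCount) {
        if (!movement && !loadChange) {
            return personCount == 0 ? Action.OFF : Action.DELAYED_OFF;
        }
        if (!movement && loadChange) {
            return personCount == 0 ? Action.OFF : Action.ENABLE;
        }
        if (movement && !loadChange) {
            return personCount == 0 ? Action.DELAYED_OFF : Action.ENABLE;
        }
        // movement && loadChange
        return personCount == 0 ? Action.DELAYED_OFF : Action.ENABLE;
    }

    public static Error error(boolean movement, boolean loadChange, int personCount) {
        if (personCount == -1) return Error.COUNT_ERROR;
        if (!movement && loadChange && personCount == 0) return Error.UNEXPECTED_LOAD;
        if (movement && !loadChange && personCount == 0) return Error.UNEXPECTED_MOVEMENT;
        if (movement && loadChange && personCount == 0) return Error.COUNT_ERROR;
        return Error.NO;
    }
}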

5.1.2 Design of the Energy Efficiency Optimization module

The algorithm is defined by two Java classes inside the package es.tecnalia.eDIANA.CDC.eeAlgorithm (these software classes will be released as part of the results of WP7):

CDCDBAccess

Algorithm

The CDCDBAccess class provides all the functionality regarding access to the CDC Database, in addition to the logic that defines the variables that feed the truth table:


isMovement: it returns true when movement is detected in the zone.

o Used tables: ICMovementDetection, ipCamera and OccupancySensor

isAnyone (deprecated until the IP camera provides date and time): it returns:

o 1 if there is somebody in the zone

o -1 if there is a conflict with the person count

o 0 if there is no one in the zone

o Used tables: PersonsCounting and ipCamera

isLoadChange: it returns true if there is a change in the consumption of the devices in the table myDevices. It stores the consumptions of the last iteration in a list and compares them with the consumptions declared in the table myDevices.

o Used tables: myDevices

The functionality to access the database is provided by the following functions:

connectDB: opens the connection to the database identified by the parameters

makeQuery: makes the query (string parameter) to the database

makeUpdate: makes the update (string parameter)

closeConnection: closes the connection to the database

Some other functionalities are available in this class:

DevicesToTurnOff: selects all the devices of the zone that are connected, have status > 0 and have a non-null consumption

o Used table: myDevices

TurnOffInDB: sets the fields devicecurrenteusekw, activationtime and switchenable to the values of a switched off device

o Used table: myDevices
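A condensed sketch of how CDCDBAccess could wrap these operations with JDBC is shown below; the SQL text, column names and join conditions are assumptions, since the actual queries belong to the WP7 release:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Illustrative sketch of the database-access helper. Only the general shape of
// the functions described above is shown; the SQL text is an assumption.
public class CDCDBAccess {

    private Connection connection;

    /** connectDB: opens the connection to the database identified by the parameters. */
    public void connectDB(String url, String user, String password) throws SQLException {
        connection = DriverManager.getConnection(url, user, password);
    }

    /** makeQuery: executes the query passed as a string and returns the result set. */
    public ResultSet makeQuery(String sql) throws SQLException {
        Statement st = connection.createStatement();
        return st.executeQuery(sql);
    }

    /** makeUpdate: executes the update passed as a string. */
    public int makeUpdate(String sql) throws SQLException {
        try (Statement st = connection.createStatement()) {
            return st.executeUpdate(sql);
        }
    }

    /** closeConnection: closes the connection to the database. */
    public void closeConnection() throws SQLException {
        if (connection != null) {
            connection.close();
        }
    }

    /** isMovement: true when movement has been detected in the given zone (assumed query;
     *  the real implementation would also filter on recent timestamps). */
    public boolean isMovement(int idZone) throws SQLException {
        try (ResultSet rs = makeQuery(
                "SELECT COUNT(*) FROM ICMovementDetection d "
                + "JOIN ipCamera c ON d.ipCamera_idipCamera = c.idipCamera "
                + "WHERE c.idZone = " + idZone)) {
            return rs.next() && rs.getInt(1) > 0;
        }
    }
}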

5.2 Test Methodology

The IP camera will contribute to optimizing the energy efficiency control. The IP camera is integrated, like any other device, at the cell level of the eDIANA architecture.


Because the IP camera is a rather complex device, it is connected directly to the Cell Device Concentrator (CDC) and not through the iEi element being developed in this project. As explained in previous sections, the IP camera provides information to the CDC that will eventually allow a better energy efficiency management. In order to validate in the best way the whole "subsystem" formed by the IP camera, the CDC, the connection between both devices and, on top of these elements, the final energy efficiency functionality of the eDIANA architecture, four different test levels have been defined:

Level 1. Firmware of the IP camera.
Level 2. Communication between IP camera and SPEAr600.
Level 3. Data stored into the SPEAr600 database.
Level 4. Energy efficiency management.

5.2.1 Level 1. Firmware of the IP camera

The IP camera firmware incorporates several algorithms that calculate variables which are sent to the CDC and used in the energy efficiency management. The variables obtained by the applications running on the IP camera are:

Persons counting

At this level, the tests that validate the correct functioning of the device algorithms are defined.

5.2.1.1 Level 1. Validation platform

This subsection describes the platform on which the tests will be run. The platform consists of the IP camera itself connected to a PC on which its development framework is running.


Figure 14. Monitoring IP camera.

5.2.1.2 Level 1. Tests

The tests are identified with a simple code: "N" + 1 (the level) + "." + a two-digit sequential number.

N1.01 Persons counting. Verify the correct behaviour of the algorithm: it adds 1 when a person enters the IP camera control area and subtracts 1 when a person leaves it.

N1.02 Persons counting. Verify the correct behaviour of the algorithm when a group of persons enters the IP camera control area at the same time, and when a group of persons leaves it at the same time.

N1.03 Persons counting. Verify the correct behaviour of the algorithm: if the IP camera resets, the persons counter correctly keeps the stored number of persons. Obviously, if the IP camera is switched off when someone passes through, the counter is not updated.

N1.04 Persons counting. Verify the correct behaviour of the algorithm when the number of persons coming out is greater than the number of persons coming in; that is, at the end of the day the counter is negative.

N1.05 Persons counting. Verify the reset of the persons counter at the default programmed time, 11:59 pm.

N1.06 Movement detection. Verify the correct behaviour of the algorithm: if one or several elements change their position inside the controlled IP camera area, an event is triggered. This event has two associated images: the image before the movement and the image after the movement.


5.2.2 Level 2. Communication between IP camera and SPEAr600

As already described, the IP camera is directly connected to the CDC, in this case the SPEAr600 system, through the ONVIF protocol over an IEEE 802.11g communication network. This protocol is based on web-services technology. In this specific platform, the IP camera acts as the ONVIF server and the CDC (SPEAr600) acts as the ONVIF client. In order to check the communication between the SPEAr600 evaluation board and an AXIS IP camera, a main test program saves two images, together with their relevant information, to the device disk.

First of all, the program connects by using a socket (connectVideoStream). After this, a loop saves two images on the device disk using the function getFrameFromVideoStream. Once a frame is available, it is saved as a JPEG image and the relevant information is extracted. At the end, the socket is closed. The next figure explains this program graphically.

Figure 15. Main program diagram
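The main test program could look roughly as follows; connectVideoStream and getFrameFromVideoStream are the functions named in the text, but their signatures, the camera address and the frame-extraction details are assumptions:

import java.io.FileOutputStream;
import java.io.IOException;
import java.net.Socket;

// Illustrative sketch of the Level 2 test program: connect to the camera video
// stream, save two frames to disk as JPEG files, then close the socket.
public class Level2CommunicationTest {

    public static void main(String[] args) throws IOException {
        Socket stream = connectVideoStream("192.168.1.90", 80); // assumed camera address
        try {
            for (int i = 0; i < 2; i++) {
                byte[] frame = getFrameFromVideoStream(stream);
                try (FileOutputStream out = new FileOutputStream("frame" + i + ".jpg")) {
                    out.write(frame);
                }
                System.out.println("Saved frame " + i + " (" + frame.length + " bytes)");
            }
        } finally {
            stream.close(); // close the socket at the end
        }
    }

    // Assumed helper: opens the socket towards the camera video stream.
    private static Socket connectVideoStream(String host, int port) throws IOException {
        return new Socket(host, port);
    }

    // Assumed helper: reads one JPEG frame from the stream (details omitted).
    private static byte[] getFrameFromVideoStream(Socket stream) throws IOException {
        byte[] buffer = new byte[64 * 1024];
        int read = stream.getInputStream().read(buffer);
        byte[] frame = new byte[Math.max(read, 0)];
        System.arraycopy(buffer, 0, frame, 0, frame.length);
        return frame;
    }
}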


5.2.2.1 Level 2. Validation platform

The validation platform consists of an AXIS P3301 IP camera and a SPEAr600 system acting as CDC.

Figure 16. Connection between IP camera device and the cell CDC (in this case, SPEAr600 system)

5.2.2.2 Level 2. Tests

The tests are identified with a simple code: "N" + 2 (the level) + "." + a two-digit sequential number.

N2.01 Communication between IP camera and SPEAr600. Verify the correct behaviour of the communication between both devices: if the communication is broken, the camera will try to reconnect X times (or indefinitely).

N2.02 Communication between IP camera and SPEAr600. Verify the correct behaviour of the communication between both devices: the communication will be monitored for 96 hours in order to determine (a possible monitoring sketch is given after this list):

o Number of lost connections.
o Duration of each broken connection.
o Number of problems in the communication: abnormal delay in sending frames, missing information, etc.
o Evaluation of the availability and performance of the communication.
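A possible sketch of the 96-hour monitoring, assuming a simple periodic probe of the camera (address, port and probe period are assumptions):

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Illustrative sketch of the N2.02 monitoring: periodically probe the camera and
// record the number and duration of lost connections over 96 hours.
public class Level2Monitoring {

    public static void main(String[] args) throws InterruptedException {
        final long endTime = System.currentTimeMillis() + 96L * 3600 * 1000;
        int lostConnections = 0;
        long outageStart = -1;

        while (System.currentTimeMillis() < endTime) {
            boolean reachable = probe("192.168.1.90", 80, 2000);
            long now = System.currentTimeMillis();
            if (!reachable && outageStart < 0) {
                lostConnections++;
                outageStart = now;
            } else if (reachable && outageStart >= 0) {
                System.out.println("Connection restored after " + (now - outageStart) + " ms");
                outageStart = -1;
            }
            Thread.sleep(10_000); // probe every 10 seconds (assumed)
        }
        System.out.println("Lost connections during the test: " + lostConnections);
    }

    private static boolean probe(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }
}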

5.2.3 Level 3. Data stored into the SPEAr600 database

The data (variables) provided by the AXIS IP camera to the CDC will be stored in several records of the SPEAr600 MySQL database.


The structure of the database corresponding to this information is the following:

Figure 17. DB tables related to AXIS IP camera information.

5.2.3.1 Level 3. Validation platform

The tests of this level will use the platform described above. In addition, an interface/tool to the MySQL database running on the SPEAr600 will be used in order to validate the values of both the persons-counting and movement-event variables, together with their attributes.

5.2.3.2 Level 3. Tests

The tests are identified with a simple code: "N" + 3 (the level) + "." + a two-digit sequential number.

N3.01 ICPersonsCounting. Verify the correct behaviour of the MySQL database with respect to the IP camera information. One new record will be created daily, containing the following attributes (a sketch of a possible verification query is given after this list):

o ID camera. Identifier of the camera.
o Corresponding date.
o Number of persons who entered the camera-controlled area during the day.
o Number of persons who came out of the camera-controlled area during the day.
o Number of persons inside the camera-controlled area at the end of the day.

The correct values of these attributes will be verified.
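As an illustration, the daily record could be checked with a query of the following kind; the column names are taken from the test data shown in section 5.2.4.1, while the connection data and the consistency check are assumptions:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Illustrative check of the daily ICPersonsCounting record for one camera and day.
// Connection data and the expected relation between the values are assumptions.
public class N301PersonsCountingCheck {

    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/cdcDB", "user", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(
                 "SELECT InputsPersons, OutPutsPersons, PersonsInTheBuilding "
                 + "FROM ICPersonsCounting WHERE IdCamera = 1 "
                 + "AND DATE(Day) = '2011-03-01'")) {
            if (!rs.next()) {
                System.out.println("FAIL: no daily record found");
                return;
            }
            int in = rs.getInt("InputsPersons");
            int out = rs.getInt("OutPutsPersons");
            int inside = rs.getInt("PersonsInTheBuilding");
            // Assumed consistency check: persons inside = persons in - persons out.
            System.out.println((inside == in - out) ? "PASS" : "FAIL");
        }
    }
}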


N3.02 ICMovementDetection. Verify the correct behaviour of the MySQL database with respect to the IP camera information. A new record will be created each time a movement is detected inside the camera-controlled area, containing the following attributes:

o ID camera. Identifier of the camera.
o Corresponding time and date.
o ID alarm.
o Description of the alarm or event.

The correct values of these attributes will be verified.

5.2.4 Level 4. Energy efficiency control functionality

This section describes the tests that will be executed to verify and validate the functionality of the energy efficiency optimization. As the module has been designed to work on information obtained from the CDC Database, the tests will check the following issues:

The system does not fail if the connection to the database is unavailable.
The queries to the database return empty results.
The values obtained in the queries are valid.

Each issue will be checked independently of the rest; all the functions that interact with the database will be checked as if they were black boxes. For all of them, the database will be modified to cover the above-mentioned issues.
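A minimal sketch of this black-box checking, assuming JUnit is available on the PC platform; the database set-ups (wrong credentials, an empty copy of the database) and the accessor signatures are assumptions:

import static org.junit.Assert.assertFalse;

import org.junit.Test;

// Illustrative black-box tests for the issues listed above. The CDCDBAccess
// methods and the wrong/empty database configurations are assumptions.
public class Level4DatabaseTests {

    @Test
    public void databaseUnavailableDoesNotCrashTheAlgorithm() {
        CDCDBAccess db = new CDCDBAccess();
        try {
            // Wrong credentials on purpose: the connection must fail...
            db.connectDB("jdbc:mysql://localhost:3306/wrongDB", "bad", "bad");
        } catch (Exception expected) {
            // ...but the failure must be handled, not propagate as a system breakdown.
        }
    }

    @Test
    public void emptyTablesReturnConsistentValues() throws Exception {
        CDCDBAccess db = new CDCDBAccess();
        db.connectDB("jdbc:mysql://localhost:3306/cdcDB_empty", "user", "password");
        // With empty tables, no movement can be reported and the call must not block.
        assertFalse(db.isMovement(1));
        db.closeConnection();
    }
}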

5.2.4.1 Level 4. Validation Platform

The tests that check the Energy Efficiency Optimization module will be run on a PC platform: since the module is written in Java, a cross-platform language, the port to the final platform is immediate. The tests that check the functionality of the algorithm will be performed by filling the Database with test information that simulates the activity generated in the cell. For example, the following SQL statements simulate one person waking up in a house, in zone 5 (main bedroom) (see Figure 13), getting ready, and going to the kitchen to have breakfast before going out.

INSERT INTO `cdcDB`.`ICMovementDetection`
  (`idipCameraMovementDetection`, `ipCamera_idipCamera`, `Day`, `IdCamera`, `IdAlarma`, `DescAlarma`)
VALUES
  (1, 1, '2011-03-01 07:50:20', 1, 1, ''),
  (2, 1, '2011-03-01 07:52:20', 1, 1, ''),
  (3, 1, '2011-03-01 07:54:20', 1, 1, ''),
  (4, 1, '2011-03-01 07:56:20', 1, 0, ''),
  (5, 1, '2011-03-01 07:58:20', 1, 0, ''),
  (6, 5, '2011-03-01 08:00:00', 5, 1, ''),
  (7, 5, '2011-03-01 08:02:00', 5, 0, ''),


  (8, 5, '2011-03-01 08:04:00', 5, 0, ''),
  (9, 5, '2011-03-01 08:06:00', 5, 0, ''),
  (10, 5, '2011-03-01 08:08:00', 5, 0, ''),
  (11, 1, '2011-03-01 08:10:00', 5, 0, '');

INSERT INTO `cdcDB`.`ICPersonsCounting`
  (`idipCameraPersonsCounting`, `ipCamera_idipCamera`, `Day`, `IdCamera`, `InputsPersons`, `OutPutsPersons`, `PersonsInTheBuilding`, `idzone`)
VALUES
  (0, 0, '2011-03-01 00:10:00', 1, 1, 0, 1, 0),
  (1, 0, '2011-03-01 08:14:20', 1, 0, 1, 0, 0),
  (2, 1, '2011-03-01 08:15:00', 1, 1, 0, 1, 1),
  (3, 1, '2011-03-01 08:35:00', 1, 0, 1, 0, 1);

INSERT INTO `cdcDB`.`mydevices` VALUES
  (1,1,'socket zone 0',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,0),
  (2,1,'lighting zone 0',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,0),
  (3,1,'home media',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,0),
  (4,1,'lighting zone 1',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,1),
  (5,1,'tv zone 1',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,1),
  (6,1,'lighting zone 2',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,2),
  (7,1,'socket zone 3',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,3),
  (8,1,'lighting zone 3',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,3),
  (9,1,'socket zone 4',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,4),
  (10,1,'lighting zone 4',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,4),
  (11,1,'socket zone 5',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,5),
  (12,1,'ligthing zone 5',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,5),
  (13,1,'fridge',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,5),
  (14,1,'microwave',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,5),
  (15,1,'oven',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,5),


  (16,1,'dish washer',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,5),
  (17,1,'washing machine',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,5),
  (18,1,'socket zone 5',1,1,'none',0,0,'2011-03-01 00:00:00',0,'none','2011-03-02 00:00:00',0,5);

Figure 18 Simulation of a house, and its configuration in zones

The information in the Database must be modified to simulate different activities in the cell, in order to check that the algorithm increases the energy efficiency of the cell. The algorithm should turn off all the devices that are not delayed by the algorithm that controls the load balance, when it establishes that those devices are not in use. To facilitate the execution of these tests, an eDIANA Cell editor based on Eclipse GMF has been implemented. This editor allows the definition of a cell in a building, covering all the elements that can be stored in the CDC Database. The editor provides a connection to the database, so that each device defined in the editor as belonging to the cell is created in the CDC Database through an SQL statement. The next picture shows how the configuration of a house represented in a plan can be represented in the tool.


Figure 19 Cell Configuration tool

The next improvement to the editor will allow modifying the database through the properties of the elements, in order to simulate the activity of the cell and provide inputs that help to test the functionalities of the algorithm. These inputs will alter the consumptions of the devices in the Database and the occupancy and movement detection of the zones, simulating a real use of the devices of the cell.


5.2.4.2 Level 4. Tests

Table 5 Database interaction tests

Test: Database unavailable
Description: Alter the name, user and password of the database, and execute the function.
Applicable functions: connectDB

Test: Empty queries
Description: Empty the database or the queried tables, and check that the function returns a consistent value and does not block the algorithm.
Applicable functions: makeQuery, isMovement, isAnyone, getNumZones, isLoadchange, setNewConsumptions, getRSHighestId, DevicesToTurnOff, TurnOffInDB, getListIdmyDevices

Test: Invalid values
Description: The types of the queried data are checked through the SQL sentence itself.
Applicable functions: makeQuery, makeUpdate, isMovement, isAnyone, getNumZones, isLoadchange, setNewConsumptions, getRSHighestId, DevicesToTurnOff, TurnOffInDB, getListIdmyDevices

Regarding the software, all the functions shall fulfil the following requirements:

The return values of each function must be checked when it is called.
If the function returns a value, all the execution paths should return a valid value.

Table 6 Software tests

Test: Return values
Description: Visually check the call hierarchy of all the functions and ensure the validity of the returned values.
Applicable functions: All functions

Test: Functions exits
Description: Adapt the input parameters and the Database status and information to cover all the execution paths of the functions.
Applicable functions: All functions


Regarding the implemented functionality:

There will not be deadlock states.
The states defined in the truth table will execute the corresponding actions.
Inaccessible states will be managed so as not to block the algorithm.
Lack of information from the database will not lead to a malfunction of the system; the only effect will be that no action is taken.

Table 7 Functionality tests

Test: Deadlock states
Description: Execute all possible states of the truth table, varying the values in the database and simulating different actions of the user in the cell, and check that the order to turn off the right devices is issued.
Applicable functions: Algorithm

Test: Actions
Description: For each possible state, check that the corresponding action is executed, independently of the zone and device.
Applicable functions: Algorithm

Test: Inaccessible states
Description: Simulate incongruous data in the database and check that there are no inaccessible states. Check that there is no malfunction of the system; only a degradation of the functionality is accepted.
Applicable functions: Algorithm


Conclusions

This deliverable has described the test methodologies and the tests to be carried out in order to validate and demonstrate the embedded technologies defined for the eDIANA platform. Two specific down-scale demonstrators have been defined in order to test and validate the cell load management and the communication technologies. The validation of the graphical user interface has also been developed, in order to check user accessibility. A new functionality has been added, the optimization of the energy efficiency, developing a specific module for this optimization and making use of the information coming from the IP cameras. The tests defined in this deliverable will be carried out in Task T7.2, and the results will be immediately applied to WP8, where the real-scale demonstrator has to be defined.


Acknowledgements The eDIANA Consortium would like to acknowledge the financial support of the European Commission and National Public Authorities from Spain, Netherlands, Germany, Finland and Italy under the ARTEMIS Joint Technology Initiative.



Appendix A: UI informed Consent form


Appendix B: Debriefing Questionnaire eDIANA User Interface

Participant: _____________________________________________
Date: ____/____/_______

1) The way the system allows you to perform the tasks given is
   Confusing 1 2 3 4 5 Relevant
2) Navigation through the system is
   Confusing 1 2 3 4 5 Relevant
3) The appearance of the interface is
   Confusing 1 2 3 4 5 Relevant
4) Icons clearly represent their functions
   Confusing 1 2 3 4 5 Relevant
5) Structure and organization of the interface is
   Confusing 1 2 3 4 5 Relevant
6) Are names and descriptions appearing in the interface clear?
   Not clear 1 2 3 4 5 Very clear
7) Has it been easy for you to perform the tasks assigned?
   Very easy 1 2 3 4 5 Very difficult
8) Do you think the interface is appropriate for any kind of user?
   Not at all 1 2 3 4 5 Yes, anyone could use it
9) Interface's simplicity and ease of use in general terms
   A mess 1 2 3 4 5 Really simple
10) I think I'm going to need someone to help me handle this interface
   Indeed 1 2 3 4 5 Not at all
11) I think this interface would be very useful for me
   Sure it will 1 2 3 4 5 Sure it won't

Please, answer the following questions about yourself:


15) Personal Information
Sex: M / F
Current situation: Studying / Working / Other: _________________________
Age: _________


Appendix C: CDC Performance

Performance Test of CDC (CPU Utilization)

Prerequisites for doing performance testing of the CDC:

Fedora 14 VM with STLinux 2.3 installed
SPEAr600 evaluation board

The GNU/Linux environment on this virtual machine (Fedora + STLinux 2.3) contains the basic tools to test the performance of the CDC. In general it is possible to use common software on the target filesystem which is used as "root filesystem" running on the board.

Top:* "The top program provides a dynamic real-time view of a running system. It can display system summary information as well as a list of tasks currently being managed by the Linux kernel. The types of system summary information shown and the types, order and size of information displayed for tasks are all user configurable and that configuration can be made persistent across restarts. The program provides a limited interactive interface for process manipulation as well as a much more extensive interface for personal configuration -- encompassing every aspect of its operation. And while top is referred to throughout this document, you are free to name the program anything you wish. That new name, possibly an alias, will then be reflected on top's display and used when reading and writing a configuration file."


Ps: Standard terminal command that shows the processes active in the OS. Use it to confirm the data about CPU and memory resources used by any single process.

Free: Shows the memory consumption of the system. Use it only to confirm the data shown in real time by top.

Netperf:* is a benchmark that can be used to measure various aspects of networking performance. Currently, its focus is on bulk data transfer and request/response performance using either TCP or UDP, and the Berkeley Sockets interface. In addition, tests for DLPI, and Unix Domain Sockets, tests for IPv6 may be conditionally compiled-in.

Time:* The time command runs the specified program command with the given arguments. When command finishes, time writes a message to standard error giving timing statistics about this program run. These statistics consist of the elapsed real time between invocation and termination, the user CPU time and the system CPU time.

* from UNIX man page