
T-76.4115 Final demo

I2 Iteration

4.3.2008


Agenda

• Product presentation (20 min)
• Project close-up (20 min)
• Evaluation of the results
• Questions and discussion (5 min)


Introduction to the project

Business simulation games are used for educational purposes
Some evidence of the learning results is needed

Project topic: Learning Assessment Tool for business simulations

The primary goals of the learning assessment tool project are the following:

• To extract and store key performance data from the game system on an ongoing basis
• To generate reports (sketched below) that compare the performance of a team or a group to:
  • the average performance
  • the performance targets set in the game
  • their performance in the previous rounds of the game
• To provide what-if analysis to the players
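As a concrete illustration of the comparison reports above, here is a minimal Java sketch of the team-versus-average and round-over-round computation. All names here (RoundResult, Comparison, ReportSketch) are hypothetical, invented for this example; they are not the tool's actual API.

```java
import java.util.List;

/** Hypothetical shapes for illustration only; not the tool's real API. */
record RoundResult(String team, int round, double score) {}
record Comparison(double teamScore, double groupAverage, double previousRound, double target) {}

final class ReportSketch {
    /** Compare one team's round score to the group average for that round,
     *  the target set in the game, and the team's previous-round score. */
    static Comparison compare(List<RoundResult> results, String team, int round, double target) {
        double teamScore = scoreOf(results, team, round);
        double previous  = scoreOf(results, team, round - 1);
        double groupAverage = results.stream()
                .filter(r -> r.round() == round)
                .mapToDouble(RoundResult::score)
                .average().orElse(0.0);
        return new Comparison(teamScore, groupAverage, previous, target);
    }

    private static double scoreOf(List<RoundResult> results, String team, int round) {
        return results.stream()
                .filter(r -> r.team().equals(team) && r.round() == round)
                .mapToDouble(RoundResult::score)
                .findFirst().orElse(0.0);
    }
}
```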


Introduction to the project


Report generation

Product presentation:
Learning Assessment Tool


Weaknesses

• Not all the wanted features have been implemented
• The end-user interface is not polished
• The source tree structure is not well organized


Strengths

• Reports are created quickly from precalculated data
• The internal architecture is good; the components are replaceable (see the sketch below)
• The system can be extended quite easily
• The database structure is good
• The transfer and saving of game data has been tested extensively, and the client side is configurable
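To make the "replaceable components" strength concrete, here is a minimal sketch of the kind of seam such an architecture implies: report rendering behind an interface, so an output format can be swapped without touching the rest of the system. The names (ReportRenderer, ReportService, and both implementations) are invented for illustration and are not the project's actual code.

```java
/** Hypothetical seam: rendering is behind an interface, so the output
 *  format can be replaced without changing the calling code. */
interface ReportRenderer {
    String render(String team, double teamScore, double groupAverage);
}

class PlainTextRenderer implements ReportRenderer {
    public String render(String team, double teamScore, double groupAverage) {
        return team + ": " + teamScore + " (group average " + groupAverage + ")";
    }
}

class HtmlRenderer implements ReportRenderer {
    public String render(String team, double teamScore, double groupAverage) {
        return "<p>" + team + ": " + teamScore
             + " (group average " + groupAverage + ")</p>";
    }
}

class ReportService {
    private final ReportRenderer renderer;   // injected, hence replaceable

    ReportService(ReportRenderer renderer) {
        this.renderer = renderer;
    }

    String report(String team, double teamScore, double groupAverage) {
        return renderer.render(team, teamScore, groupAverage);
    }
}
```

Precalculated aggregates would plug in behind the same kind of seam: the service reads stored averages instead of recomputing them per request, which is one way to read the "reports are created quickly" point above.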

Project close-up


Project goals

1. Code quality and documentation
   • Code quality: 1) code reviews and acceptance testing; 2) test coverage
     • 1) OK; 2) ALMOST OK: 64% (75% required)
   • Code documentation: Javadocs
     • OK
   • Architectural documentation: reviewed with the customer
     • NOT OK (no update after I1?)

2. Functionality
   • Functionality: reviewed with the customer
     • OK
   • Performance tests
     • NOT OK (tests on a virtualized server would not have matched real conditions)


Project goals

3. Production use
   • Data collection: augmented after the first iteration
     • OK
   • Analysis: functionality and performance in production use reviewed with the customer
     • OK

4. User documentation
   • Content: reviewed with the customer
     • OK
   • Integration: incorporating the user documentation into Cesim's standard documentation build process
     • OK (except architectural documentation)

Quality goals

Evaluation of the quality goals:

Goal                     Status  Description
QG001: Code quality        3     Code has been refactored and Javadocs have been written. All unit tests pass.
QG002: Document quality    3     Documents have been reviewed.
QG003: Defect handling     2     We have followed the specified defect handling process. Two open bugs remain.
QG004: Usability           2     Tested by the peer group, who found some issues.
QG005: Functionality       3     The agreed functionality has been developed.
QG006: Performance         2     The system's client-server speed was reviewed in I1, but not again in I2 with proper data.


Quality dashboard


Project metrics – Software

Integration testing

• Was difficult to report and make visible
• Should have been planned more systematically early in the project
• Many test cases implemented with JUnit can be seen as integration tests (see the sketch below)

System testing

• The group did not have experience in planning functional tests
• System-level integration happened very late in the project
• There were only a few different functionalities
• There was no time left for designing proper functional tests
• There was no official, strict way of reporting test sessions; a loosely defined way of reporting tests was used instead
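As an example of the point above that JUnit test cases can double as integration tests, here is a small JUnit 4 sketch that drives the comparison logic and a renderer together through one path, rather than testing a single class in isolation. It reuses the hypothetical classes from the earlier sketches; it is not one of the project's actual tests.

```java
import static org.junit.Assert.assertTrue;

import java.util.List;
import org.junit.Test;

/** Integration-style JUnit test: exercises ReportSketch and
 *  ReportService together (hypothetical classes from above). */
public class ReportIntegrationTest {

    @Test
    public void comparisonFlowsThroughToRenderedReport() {
        List<RoundResult> data = List.of(
                new RoundResult("A", 1, 10.0),
                new RoundResult("A", 2, 12.0),
                new RoundResult("B", 2, 8.0));

        Comparison c = ReportSketch.compare(data, "A", 2, 11.0);
        String report = new ReportService(new PlainTextRenderer())
                .report("A", c.teamScore(), c.groupAverage());

        // Round 2 average is (12.0 + 8.0) / 2 = 10.0.
        assertTrue(report.contains("group average 10.0"));
    }
}
```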


Project metrics – Quality

Code coverages


Project metrics – Quality

Total code coverages


Project metrics – Resources

• The hour overruns were conscious decisions
• The main reasons were the big challenges the group faced:
  • New technologies required a lot of studying
  • Inexperienced SE experts
  • Difficulties in communication
• Some members were persistent and responsible

Hours by person and period:

Period                          MT   JR   ML   JL   TK   EK   MB   PL   NK  Total
PP iteration (1.9.-24.10.2007)  26   24   21   23   54   18   28   76  113    383
Iteration 1 (includes S1-S3)    93  173  123  138  146   57   95   90   59    974
Iteration 2 (includes S4-S6)    68   75  131   68  108   73   32   54   47    656
Whole project                  187  272  275  229  308  148  155  220  219   2013


Project metrics – Resources

Task                    Hours
PROGRAMMING               518
MEETINGS                  346
DOCUMENTING               285
STUDYING                  212
OTHER PM                  211
QUALITY ASSURANCE         174
COMMUNICATION             153
INFRASTRUCTURE             74
TECHNICAL DESIGN           30
REQ. ENGINEERING           15
TASK TOTALS              2015


Project metrics – Resources

[Bar chart: hours per task category (PROGRAMMING, MEETINGS, DOCUMENTING, STUDYING, OTHER PM, QUALITY ASSURANCE, COMMUNICATION, INFRASTRUCTURE, TECHNICAL DESIGN, REQUIREMENTS ENGINEERING), broken down by iteration (PP, I1, I2); y-axis 0-300 hours]

Risks

Materialized risks

• Communication problems
• Inadequate commitment to the project due to non-project activities (full-day jobs, other studies)
• A shallow or incorrect understanding of the project domain
• Lack of skill, knowledge, or expertise
• Absence of momentum in working
• Lack of an architecture description
• Unequally divided work
• Problems with selected technologies


Work practices

What was good

• Process improvement meetings
• Working in small groups
• Agilefant
• WR time tracker

What was not

• Double documentation (Agilefant + DocBook, time tracker + Excel)
• New technologies that didn't work
• Communication