
D 7.3a ICOPER Reference Model Specification Draft

ECP 2007 EDU 417007

ICOPER

ICOPER Reference Model

Specification Draft

Deliverable number D7.3a

Dissemination level Public

Delivery date 31.5.2010

Status Final

Editor(s) Bernd Simon (WUW), Mirja Pulkkinen (JYU)

eContentplus: This project is funded under the eContentplus programme [1], a multiannual Community programme to make digital content in Europe more accessible, usable and exploitable.

[1] OJ L 79, 24.3.2005, p. 1.


Contributors

Adam Cooper (Bolton)

Alexandra Okada (UNIVIE)

Anh Vu Nguyen-Ngoc (ULE)

Bernd Simon (WUW)

Daniel Müller (IMC)

Denis Kozlov (JYU)

Effie Law (ULE)

Israel Gutiérrez (UC3M)

Jad Najjar (WUW)

Jan Pawlowski (JYU)

Joris Klerx (KU Leuven)

Michael Derntl (UNIVIE)

Michael Totschnig (WUW)

Mirja Pulkkinen (JYU)

Peter Scott (OU)

Petra Oberhuemer (UNIVIE)

Susanne Neumann (UNIVIE)

Teresa Connolly (OU)

Tomaž Klobučar (JSI)

Tore Hoel (OUC)

Vana Kamtsiou (Brunel)


Acknowledgements

This draft specification documents a joint effort of the ICOPER project, an eContent+ Best Practices Network funded by the European Commission. We would like to acknowledge all contributors to this document, especially those not listed on the front page who contributed indirectly to the evolution of our ideas, for example by participating in our discussions or commenting on earlier work. In addition to the editors, the following contributors provided significant parts of this document (in brackets, the section they provided):

- Tomaž Klobučar, Vana Kamtsiou (Section 3.1 High-Level Context Scenarios)
- Michael Totschnig (Section 5 Service Descriptions)
- Adam Cooper (Section 6.1 Meta Model for Data Schemas)
- Jad Najjar (Section 6.2 Personal Achieved Learning Outcomes)
- Michael Derntl, Susanne Neumann, Petra Oberhuemer (Section 6.3 Learning Design)
- Anh Vu Nguyen-Ngoc and Effie Law (Appendix B)

Tomaž Klobučar and Tore Hoel were the reviewers of this document.


EXECUTIVE SUMMARY

The eContent+ Best Practices Network ICOPER aims to contribute to more effective and efficient technology-enhanced learning in higher education (HE) and neighbouring educational contexts. The ICOPER Reference Model (IRM) provides a common frame of reference for stakeholders in HE who wish to contribute to the development of outcome-oriented teaching and to the design and development of content, TEL systems and applications that are interoperable both at the level of processes and at the technical level (i.e. data and services). This draft specification of the IRM, a meta-model, consists of the following main elements:

- Domain Model
- Process Models
- Service Descriptions
- Data Schemas

The previous ICOPER deliverable on the IRM development, D7.1, dealt with the existing content specifications and standards, as well as other relevant standards in the domain. This intermediate version of the IRM does not concentrate on this aspect; the stand on specifications and standards will be revisited in the remaining project work. The IRM is developed following the research and development cycle of design science (Peffers et al., 2007-8), with the following phases:

1. problem identification and motivation,
2. objectives for a solution,
3. design and development,
4. demonstration,
5. evaluation, and
6. communication.

The demonstration via implemented ICOPER prototypes and the subsequent evaluation play a crucial role in this endeavour, but have yet to be delivered. Nevertheless, several results can be documented based on proof-of-concept implementations. These results include:

- A domain model developed around key concepts such as learning outcome, learning design (including teaching method), learning content, learning opportunities and assessment

- A process model consisting of learner, learning facilitator, institution-led, and assessment processes

- Service descriptions for
  o search and retrieval service
  o publication service
  o user management service
  o recommendation service
  o harvesting service
  o registry service
  o validation service

- Data schemas for
  o personal achieved learning outcomes
  o learning designs
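As a purely illustrative sketch of what a personal-achieved-learning-outcomes record could look like in code, the structure below models a learner with a list of achieved outcomes. Field and class names are assumptions for illustration only; the normative PALO schema is specified in Section 6.2 of this document.

```python
from dataclasses import dataclass, field

@dataclass
class AchievedLearningOutcome:
    """One learning outcome a learner has attained (illustrative fields)."""
    outcome_id: str
    description: str
    achieved_on: str                          # ISO 8601 date string
    evidence: list[str] = field(default_factory=list)

@dataclass
class PALORecord:
    """A learner's collection of achieved learning outcomes."""
    learner_id: str
    outcomes: list[AchievedLearningOutcome] = field(default_factory=list)

record = PALORecord("learner-001")
record.outcomes.append(AchievedLearningOutcome(
    outcome_id="lo-42",
    description="Can list a number of learning technologies and their properties",
    achieved_on="2010-05-31"))
print(len(record.outcomes))
```

Such a structure is what a data schema in the IRM sense would pin down normatively, so that systems exchanging PALO records agree on field names, types, and semantics.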


TABLE OF CONTENTS

1 INTRODUCTION
1.1 MOTIVATION AND MAIN OBJECTIVES
1.2 WHAT IS A REFERENCE MODEL?
1.3 DOCUMENT STRUCTURE
2 METHODOLOGY
3 DOMAIN MODEL
3.1 HIGH-LEVEL CONTEXT SCENARIOS
3.2 CONCEPT MODEL
4 PROCESS MODELS
4.1 THE META MODEL OF IRM PROCESS MODELS
4.1.1 Process modeling
4.1.2 Main Process Areas
4.2 PRIMARY PROCESSES
4.2.1 Learner Processes
4.2.2 Learning Facilitator Processes
4.2.3 Institution-Led Processes
4.2.4 Assessment Processes
4.2.5 Content Provider Processes
4.3 PROTOTYPE PROCESS EXAMPLES
4.3.1 Learning Facilitator Process Support: Design a Course (WUW Prototype)
4.3.2 Learning Facilitator Process Support: Add Learning Outcomes to Lesson (WUW Prototype)
4.3.3 Import Learning Designs in Elgg, Get Recommendations
4.3.4 Learning Design use sub-processes (Giunti prototype)
4.3.5 Assessment: Assessment Planning, Assessment Activities
4.3.6 Publish Personal Achieved Learning Outcome in a third party environment (IMC Prototype)
4.3.7 Example of services rendered by OICS for learning processes (UMU Prototype)
4.3.8 Recommendation service: Example 2 of services rendered by OICS for learning processes
5 SERVICE DESCRIPTIONS
5.1 DESIGN PRINCIPLES FOR SERVICE DESCRIPTIONS
5.2 SERVICE TYPES
5.3 THE OICS MIDDLE LAYER API
6 DATA SCHEMAS
6.1 META MODEL FOR DATA SCHEMAS
6.2 PERSONAL ACHIEVED LEARNING OUTCOMES
6.3 LEARNING DESIGN DATA MODEL
7 CONCLUSION
REFERENCES
APPENDIX A: ENGINEERING EVALUATION
APPENDIX B: END USER EVALUATION PLAN OF PROTOTYPES


LIST OF FIGURES

Figure 1. Methodological Foundation of the IRM Development (adopted from Peffers et al., 2007-8)
Figure 2. ICOPER Key Concepts
Figure 3. Meta Model of the IRM
Figure 4. IRM Concept Model
Figure 5. BPMN Basic Shapes
Figure 6. Pool Structure, Actor Names and Aliases
Figure 7. Scenario Pool Example: Re-skilling Scenario
Figure 8. The IRM main process areas, 0-Level
Figure 9. Learner’s Processes
Figure 10. Learner-centered Scenario
Figure 11. Learner-centered Scenario, Phases 2 and 3, Learning, Assessment and PALO Management
Figure 12. Learning Facilitator Process areas and processes
Figure 13. Sub-Processes of Learning Facilitator, case Preparing Learning Designs
Figure 14. Institutional Processes (Strategy, Management, Administration)
Figure 15. Assessment Process Activities
Figure 16. Personal Achieved Learning Outcomes – Diagrammatic Representation
Figure 17. Learning Design Data Model – Diagrammatic Representation

LIST OF TABLES

Table 1. Examples of Concept Definitions, Problem Statements, and Target Audience
Table 2. IRM Prototypes
Table 3. IRM Concepts including Definitions
Table 4. Data Model Table Structure
Table 5. Personal Achieved Learning Outcomes – Table Representation
Table 6. Learning Design Data Model – Table Representation


LIST OF ACRONYMS

BPMN Business Process Modelling Notation

ELF E-Learning Framework

EQF European Qualification Framework

HE Higher Education

HEI Higher Education Institution

HR Human Resource

IRM ICOPER Reference Model

LOM Learning Object Metadata

OER Open Educational Resources

OICS Open ICOPER Content Space

PALO Personal Achieved Learning Outcomes

PLE Personal Learning Environment

SOA Service-Oriented Architecture

TEL Technology-Enhanced Learning

WP Work package


1 Introduction

1.1 Motivation and Main Objectives

On 14 February 2008 the European Council formally adopted the European Qualification Framework (EQF) as a central means towards a more outcome-oriented way of delivering higher education. This revived emphasis on outcome-oriented teaching and learning is considered to be highly beneficial to higher education institutions and their stakeholders. Some of the benefits associated with outcome-orientation include, but are not limited to:

• Outcome-orientation helps to ensure consistency of course delivery within study programs.

• Outcome-orientation highlights the dependencies between teaching, learning and assessing.

• Learning outcomes cascade from study program level to module and course levels, ensuring subject consistency and helping to identify overlaps.

• Outcome-orientation empowers students to make more informed choices on study programs and learning paths.

• Outcome-orientation increases transparency for different groups of stakeholders.

• Outcome-orientation provides a better linkage between employment, vocational training and higher education.

However, higher education institutions are still struggling to shift away from the traditional approach of describing learning opportunities by emphasizing learning inputs (e.g. the length of a learning experience) towards describing them by learning outcomes. Beyond the organisational and intellectual challenges of developing outcome-oriented learning opportunities, the technical challenge of modelling learning outcomes and relating them to various educational resources remains. Hence, the interoperability of learning outcome definitions constitutes a major challenge for competency-driven learning designs and implementations. Up to now, continuing education centres and higher education institutions have not been offered standards for learning outcome definition, resulting in inefficiencies for both outcome-oriented learning design and competency-driven content access and re-use.

The eContent+ Best Practices Network ICOPER is developing a reference model for introducing outcome-orientation in higher education. The ICOPER Reference Model (IRM) thus aims to contribute to the design of outcome-oriented learning.

Although a significant number of standards and specifications have been produced in the last decade, access to educational content is still limited. Lack of interoperability affects content developers, technology providers, instructors, and learners. This unsatisfying situation has a significant impact on the European Higher Education Area, where content developers, for example, suffer from increased production costs. Technology providers, on the other hand, find it hard to judge the importance of being compliant with the plethora of standards and specifications, and struggle to incorporate their implementations into the release plans of their products. Higher education institutions and continuing education centres still fail to re-use costly developed content.
There were hopes that digital learning material would support new business models by enabling new forms of less formal learning. These hopes have yet to be fulfilled.


At the same time, these institutions are confronted with new waves of technologies such as Web 2.0, where content production is outsourced to the consumer and learners become increasingly active in social networks, while instructors are still confronted with a closed world of tools and systems that hinders an effective implementation of learning opportunities and the outreach of their knowledge beyond organisational borders. Learners suffer from limitations in their learning experience due to inefficient access mechanisms and a limited pool of learning content.

Overall, the confusion around the applicability (fitness for purpose) of standards and specifications in technology-enhanced learning results in a lack of adoption, which in turn has a profound negative impact on learning in the European Higher Education Area.

The eContent+ Best Practices Network ICOPER aims to contribute to more effective and efficient technology-enhanced learning in higher education and neighbouring educational contexts. The ICOPER Reference Model aims to contribute to the interoperability of outcome-oriented teaching, both at the process level and at the technical level (i.e. data and services). This draft specification of the IRM aims, first, at providing a base for discussion within the ICOPER consortium and the WP teams and, second, at initiating discussion with a broad audience of experts representing higher education institutions, content providers, instructors, and learners.

1.2 What is a Reference Model?

A reference model is attributed to be universal within a domain and to have the character of a recommendation (Thomas 2006). However, declaring a model a “reference model” is not sufficient for it to really claim this status; it also has to be accepted as such within the intended user community. Thomas (2006) goes on to define:

“The term reference model can be explained as a concretion of the term “information model” on the basis of the constituent attribute of user-sided acceptance: A Reference model – specifically: reference information model – is an information model used for supporting the construction of other models.”

However, to be functional as a collaboration tool, a reference model should also cover other aspects besides information. Business processes, information, applications and technological architectures are the parts most commonly covered in workable models (The Open Group 2009). Compared to other domains, the field of TEL does not have a broad palette of available reference models. The most notable TEL reference models relevant to the IRM are:

1. JISC E-Learning Framework (Wilson et al., 2004)
2. FREMA (JISC, 2005)
3. SCORM (Wisher, 2009)

The JISC E-Learning Framework (ELF) is a service-oriented factoring of the core services required to support e-Learning applications, portals and other user agents. Each service defined by the Framework is envisaged as being provided as a networked service within an organisation, typically using either Web Services or a REST-style HTTP protocol (Wilson et al., 2004). The ELF is relevant to the IRM, since both the IRM and the ELF focus on technical services for the purposes of e-Learning. The service descriptions of the IRM have already been inspired by the ELF, and more IRM work based on the ELF is foreseen for D7.3b.

FREMA (JISC 2005) is a reference model for systems in the Assessment domain of e-Learning that are built on top of Service-Oriented Architectures, such as Web Services and the Grid, and in particular the JISC E-Learning Framework. The Assessment Reference
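A REST-style HTTP protocol of the kind the ELF envisages boils down to services exposed at URLs, queried with a verb and parameters, and answering in a structured format. The sketch below illustrates this pattern only; the base URL, endpoint path, parameter names, and response shape are hypothetical, not part of the ELF or OICS specifications.

```python
import json
from urllib.parse import urlencode

# Hypothetical service base URL; a real deployment would publish its own.
BASE = "https://oics.example.org/api"

def build_search_request(query: str, max_results: int = 10) -> str:
    """Build a REST-style search URL: a GET with a URL-encoded query string."""
    return f"{BASE}/search?" + urlencode({"q": query, "limit": max_results})

url = build_search_request("learning outcome")
print(url)

# A canned JSON document standing in for a service response, to show
# how a client would consume a structured result set.
response_body = '{"results": [{"id": "lo-42", "title": "Learning technologies"}]}'
results = json.loads(response_body)["results"]
print(results[0]["title"])
```

The appeal of this style, and the reason the ELF recommends it, is that any client able to issue HTTP requests and parse the response format can use the service, independently of the platform the service is implemented on.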


Model describes how the assessment domain maps to the ELF and thus acts as a driver for implementation and evaluation. The Assessment Reference Model eases the development of further services and promotes the re-use of existing ones (JISC 2005). Learner assessment and learning process evaluation are integrated parts of the IRM at the process and service levels. Consequently, FREMA provided useful recommendations on how to define and implement the sub-processes comprising learner assessment and learning process evaluation. In turn, the relation of FREMA to SOA can be used to enhance the service classification derived from the ELF.

SCORM, the Sharable Content Object Reference Model, integrates a set of related technical standards, specifications, and guidelines designed to meet ADL’s functional requirements, such as accessibility, interoperability, durability, and reusability of content (Wisher, 2004), mainly in a distance learning setting. Initially, SCORM was developed to support self-study modules designed for use by the U.S. Department of Defence. SCORM enables the communication between learning objects and learning management systems, usually covering sequencing as well as assessments and grading procedures. This has led critics to suggest that SCORM is not flexible enough to allow for a variety of pedagogies (Welsch, 2002). Another limitation of SCORM is that it does not meet the requirements of many stakeholders in the European community within learning, education, and training. One of the reasons is the main paradigm of SCORM: the focus on distributed content delivery to a single learner in a distance learning scenario. It neglects recent (mainly European) approaches which are currently being adopted and improved by a wide European user community, specifically learning design and outcome-based learning.

1.3 Document Structure

This introduction is followed by a methodology section, where the methodology applied is described and the status quo documented. Section 3 presents the underlying domain model. The process models are documented in Section 4. Section 5 provides the service descriptions of the IRM. Section 6 introduces the data models developed so far. Section 7 summarizes the main contributions of this document.


2 Methodology

The intention of this guideline is to meet the following needs:

1. For the construction of the IRM as a community model, commonly agreed guidelines are needed to aid and support the distributed development work in the different work areas by the ICOPER partners as well as the work packages. This methodology shall ensure alignment between partners and work packages.

2. The IRM is aimed at supporting communication about standards in the field of TEL, and is to be developed into a specification that qualifies at the level of a standard. It should therefore follow, and be backed up by, existing standards in modelling and description techniques.

3. The IRM is intended to be retained as a living model within the TEL community, and as such it will continue to evolve. To support and enable this further evolution, the IRM is furnished with a development guideline.

As proposed in the ICOPER deliverable D10.2, a design science research methodology has been adapted for the development of the IRM. The underlying Design Science Research Methodology, as depicted in Figure 1, consists of a sequence of six activities (Peffers et al., 2007-8):

1. problem identification and motivation,
2. objectives for a solution,
3. design and development,
4. demonstration,
5. evaluation, and
6. communication.

Though the activities are ordered sequentially, the process may start at any of the first four steps and iterate back to the phases of objectives definition, design and development, and demonstration. The entry point depends on the nature of the problem and the triggering factors. In the case of ICOPER, however, we started with step 1 and followed the subsequent steps as proposed.

Figure 1. Methodological Foundation of the IRM Development (adopted from Peffers et al., 2007-8)


Problem identification and motivation

Initiated by the description of work and subsequent discussion within the TEL community, the following problem areas have been identified:

1. design of learning outcome definitions and unified descriptions of learning opportunities,
2. sharing of teaching methods and learning designs,
3. propagation of learning content re-use,
4. learning design driven delivery of learning opportunities,
5. development and re-use of assessment resources.

For each of the above-mentioned problem areas, several problems the IRM aims to solve have been defined. The significance of the problem and the value of a potential solution are justified by external experts [1] in order to motivate researchers and a wider audience to pursue the solution and accept the results. Important stakeholders were identified and involved in the process of developing and evaluating the proposed solution. A common understanding of the problem and of basic terms among all project partners represents the basis for successful work on the problem. Table 1 exemplifies the outcome of this phase with respect to the two IRM key concepts, Learning Outcome and Learning Design.

Figure 2. ICOPER Key Concepts

[1] This process is still ongoing.


Objectives for the solution

In this step the identified problem is translated into requirements. In the first phase of the IRM development, we identified the following requirements for the model in an open discourse (see Fettke & Loos 2003, Frank 2007):

• Support of Outcome-Based Learning: The IRM shall enable outcome-based learning.

• Interoperability: The IRM will consequently support the development of interoperable systems and solutions. According to IEEE, interoperability means “the ability of two or more systems or components to exchange information and to use the information that has been exchanged” (IEEE 1990). The IRM’s purpose is to support the development of interoperable TEL systems and applications, with a focus on content management, to enable content interoperability in TEL and, more generally, e-Learning.

• Compliance: The IRM shall be compliant with existing standards and specifications and provide guidance towards the adoption of those standards and specifications.

• Re-Use: The IRM shall support re-use processes, in particular the re-use and improvement of learning content.

• Service-Oriented Architecture: The IRM is based on the service-orientation approach, which means loose coupling of applications through standardized interfaces. Specifications of content and metadata enable access to content as well as, e.g., learner information through services supporting various e-Learning activities, enabling content interoperability across systems and applications. Building interoperable services means that the content, the information needed by the service, and the process of service provisioning are all modelled. This gives the ‘layer’ structure of the IRM, with a process layer, a services layer and a data layer. Service orientation is the mainstream in building Internet-based systems and applications; consortia like W3C and OASIS provide the basic specifications, standards and models for this.

• Extensibility and Adaptability: The IRM shall be extensible to include specific requirements and needs of a specific educational context and it should as well be adaptable to incorporate specific needs and requirements of stakeholders.

• Participation: The IRM will be developed based on a participatory approach to involve stakeholders from different organizations and interest groups.

• Completeness: The IRM shall cover state-of-the-art TEL scenarios and related systems.

• Harmonization: The IRM includes, incorporates and harmonizes successful good practice approaches.
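The loose coupling through standardized interfaces that the Service-Oriented Architecture requirement above calls for can be sketched in a few lines: clients program against an interface, and implementations can be swapped or composed without the client changing. All class and method names here are hypothetical illustrations, not part of the IRM or OICS specifications.

```python
from abc import ABC, abstractmethod

class RepositoryService(ABC):
    """Standardized interface: clients depend only on this contract."""
    @abstractmethod
    def search(self, query: str) -> list[str]: ...

class LocalRepository(RepositoryService):
    """One concrete provider: searches an in-memory collection."""
    def __init__(self, items: list[str]):
        self._items = items
    def search(self, query: str) -> list[str]:
        return [i for i in self._items if query.lower() in i.lower()]

class CachingRepository(RepositoryService):
    """Another provider, wrapping any RepositoryService; callers need not change."""
    def __init__(self, inner: RepositoryService):
        self._inner, self._cache = inner, {}
    def search(self, query: str) -> list[str]:
        if query not in self._cache:
            self._cache[query] = self._inner.search(query)
        return self._cache[query]

# The client holds a RepositoryService; which implementation sits behind
# the interface is a deployment decision, not a client concern.
repo: RepositoryService = CachingRepository(
    LocalRepository(["Learning Design 101", "Assessment basics"]))
print(repo.search("design"))
```

In a deployed SOA the interface contract would be a published service description rather than an abstract class, but the design principle, that processes, information, and provisioning are modelled against a stable contract, is the same one the requirement states.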


Table 1. Examples of Concept Definitions, Problem Statements, and Target Audience

Learning Outcome

- Concept (Definition and Example): Learning Outcomes refer to statements of what a learner knows, understands and is able to do on completion of a Learning Opportunity (European Commission 2008). "The student is able to list a number of learning technologies and their properties." is an example of a Learning Outcome.
- Interoperability Problem: The ICOPER Project is concerned with the interoperability of Learning Outcomes, for example, when Learning Outcomes are provided for re-use in the planning of courses, i.e. the creation of Learning Designs, or when students after successful completion of a course aim at including these Learning Outcomes in Personal Achievement Profiles.
- Target Audiences / Stakeholders: Programme Director, Faculty, Learner, Employer
- Relevant Standards and Specifications: IEEE LOM and Profiles, IEEE RCD and Profiles, Personal Achieved Learning Outcomes, ICOPER Middle Layer Specification

Learning Design

- Concept (Definition and Example): A Learning Design is a reusable representation of a concrete Learning Opportunity. A Learning Design arranges Teaching Methods, Assessment Methods, Learning Content and Learning Tools towards Learning Outcome attainment. A sketch of a Learning Design can for example be described as follows: "After taking this course a student is able to list a number of learning technologies and their properties. In order to achieve this learning outcome we will ask the students to attend a presentation on learning technologies that will also include some demos. After the presentation the student will be confronted with a short test."
- Interoperability Problem: In the context of such a scenario ICOPER aims at facilitating collaboration around Learning Designs, starting with the creation of Learning Designs out of respective Learning Opportunities, the sharing of Learning Designs, as well as finding peers based on Learning Designs.
- Target Audiences / Stakeholders: Faculty, Programme Director
- Relevant Standards and Specifications: IEEE LOM and Profiles, IEEE RCD and Profiles, IMS Learning Design, Personal Achieved Learning Outcomes, ICOPER Middle Layer Specification

Based on these model requirements a meta model was created that consists of the components depicted in Figure 3. Process Models, in which educationally relevant processes are formally described, constitute the centre of our IRM. In order to support interoperability at the system-to-system communication level, unified service descriptions as well as data schemas are part of the IRM. Process models, service descriptions and data schemas are contextualized in a domain model consisting of high-level scenarios and a concept model.

Figure 3. Meta Model of the IRM
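As an illustration only (the names below are invented for this sketch and are not part of the IRM), the three layers of the meta model can be pictured as a data schema, a service description, and a process step expressed against that service description:

```python
from dataclasses import dataclass
from typing import List

# Data layer: a unified data schema (hypothetical, simplified).
@dataclass
class LearningOutcome:
    identifier: str
    statement: str

# Service layer: a unified service description that systems agree on.
class OutcomeService:
    def search(self, query: str) -> List[LearningOutcome]:
        raise NotImplementedError

# Process layer: an educational process step expressed against the
# service description, independent of any concrete implementation.
def plan_course(service: OutcomeService, topic: str) -> List[str]:
    """Select outcome statements for a course on the given topic."""
    return [o.statement for o in service.search(topic)]
```

The process step (`plan_course`) only knows the service description and the data schema, which is exactly the contextualization relationship the meta model expresses.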

Design and development
Artefacts – in our case prototypes – were and still are designed and developed in this step. Quality criteria for the artefacts depend on the nature of the artefacts. Criteria for products and services include scalability, simplicity and flexibility of interfaces, versioning and smaller project milestones, user involvement, and compatibility on different levels such as:

- communication protocol,
- communication interface,
- (meta)data access,
- (meta)data types,
- application semantics,
- application functionality,
- process and policy.

The work is based on existing practices and tools in technology-enhanced learning. Table 2 provides an overview of the prototypes currently developed and the key concept under focus.

Table 2. IRM Prototypes

LD.OpenGLM.UNIVIE (Open Graphical Learning Modeller)
    Partner: UNIVIE
    Key Concepts: Learning Outcome, Teaching Method, Learning Design

Rec.Moodle.UMU-KTU (Recommendation widgets in Moodle)
    Partner: UMU
    Key Concepts: Learning Design

Ass.LRN.UC3M (Planning and Managing Assessment in .LRN)
    Partner: UC3M
    Key Concepts: Assessment Resources

Fin.Moodle.UMU (Outcome-based finding of learning opportunities)
    Partner: UMU
    Key Concepts: Learning Outcome, Learning Design

VE.Moodle.UMU (Viewing and exporting learning outcome profiles)
    Partner: UMU
    Key Concepts: Learning Outcome, Personal Achievement Profile

SR.ELGG.TLU (Outcome-based search and recommendation of learning opportunities in Elgg)
    Partner: TLU
    Key Concepts: Learning Design, Learning Outcome, Personal Achievement Profile

PALO.Facebook.AGH (Facebook Learning Outcomes Profile Application)
    Partner: AGH
    Key Concepts: Learning Outcome, Personal Achievement Profile

CD.LEARN@WU (Learning Outcomes Oriented Course Design in Learn@WU)
    Partner: WUW
    Key Concepts: Learning Outcome, Teaching Method, Learning Design

LO.LearneXact.Giunti (Outcome-based management of Learning Opportunities in learn eXact Enterprise)
    Partner: Giunti
    Key Concepts: Learning Design, Assessment, Personal Achievement Profile

Aut.MediaLibrary.HUM-OUNL (Media Library Authoring Toolset)
    Partner: OUNL, HUM
    Key Concepts: Learning Content

LO.2Know2.KM (2know2.com Learning Market Place and Competency Records)
    Partner: KM
    Key Concepts: Learning Outcome, Personal Achievement Profile

Aut.xoLRN.KM (Learning content management and authoring)
    Partner: KM
    Key Concepts: Learning Content, Assessment Resources

Demonstration
The prototypes must be able to demonstrate that they can solve the identified problems. The next phase of the project is devoted to this activity.

Evaluation
In this step we measure how well the developed artefacts solve the problem. In ICOPER three types of target audience need to be addressed in evaluation, which ultimately leads to three types of evaluations:

1. with implementers, tool developers and technology providers
2. with instructors, learners, curriculum developers, administration or management
3. with researchers and standardization bodies

Appendix A describes the evaluation strategy for the engineering evaluation, Appendix B does the same for end user evaluation.

Communication
The problem, its importance and its solutions are communicated to a wide audience via different ICOPER channels, such as dissemination, liaison, training and consultation activities.

3 Domain Model

3.1 High-Level Context Scenarios

The following four high-level context scenarios provide an introduction into the domain of higher education and neighbouring fields.

Scenario 1: Institution-managed development of outcome-based learning

Information security is becoming an essential requirement for every aspect of modern society. Unfortunately current university programmes (computer science, social science, criminal justice) tackle this interdisciplinary field from their points of view only and as such do not “produce” the complete experts that companies need. To meet those needs the University of Adriatic has decided to create a new master programme on information security. The main goals of the study programme are defined in a dialogue between representatives of the higher education institution, professional associations and relevant companies. The learning outcomes students will obtain are the crucial point when creating the curriculum. A special working group at the institutional level and learning professionals (teachers, facilitators) are involved in the definition of learning outcomes that are in line with strategic goals and in the development of the programme. Developed learning outcomes are validated with the companies and professional associations. Budget restrictions imposed by the Ministry of Education are taken into account when defining the timeframe of the programme, the number of total course credits (ECTS) and the number of learning professionals (teachers, tutors) involved. The university can offer the new study programme once it is evaluated and accredited by a national accreditation body on the basis of the full documentation that includes a detailed description of the learning outcomes of the programme and all courses. Evaluation of the programme implementation is regularly performed in order to assure a high quality. In the new programme Professor Bongo has been selected as responsible instructor for computer forensics teaching. As part of the preparation of his course (i.e. learning opportunity) he defines in collaboration with other members of the information security laboratory a detailed list of general and subject specific learning outcomes for the course. 
The intended learning outcomes learners will attain in the course follow the general programme outcomes defined at the institutional level. Professor Bongo then selects appropriate teaching methods, existing learning designs and assessment methods. With the help of a content developer he selects, updates and prepares learning content and tools, and combines everything in a new learning design. Before the course starts, Professor Bongo uploads the learning design he created to his institution's learning management system (in this case Moodle version 1.6) and offers it under a Creative Commons licence for re-use, especially by other instructors in the domain. The learning opportunity created is now offered to students, who eagerly subscribe to the course. After their engagement in the electronic learning environment, the learning outcomes of the course are added to Professor Bongo's “taught” learning outcomes. Exporting this part of the profile enables Professor Bongo to get in contact with other learning professionals who teach similar courses.

Scenario 2: Development of corporate competencies

In light of a company’s strategy to become a leading manufacturer of electric cars, the company is preparing to start a project on developing a new generation of in-wheel motors. This will require a change in various business processes, and new skills and competencies to be obtained by certain employees. Based on the overall project goals, business processes and outcomes set by the company management, a human resource developer analyses existing and required learning outcomes at individual and department levels and identifies knowledge, skill and competence gaps for an R&D group, which will be responsible for the new project. The analysis is done within a corporate learning environment that contains the professional learning outcome profiles of every employee. As a next step, the internal HR development team, in cooperation with an external higher education institution, designs a learning plan for the R&D group and identifies required learning designs. Here, financial restrictions and timelines are an important factor during the planning and design process. The company and the higher education institution also clearly define their intellectual property rights regarding the created learning content.

During the development of the new project, the R&D group works closely with the HR development team to develop the necessary learning opportunities. Learning designs are linked to business processes in order to be able to measure learner performance in relation to the defined learning outcomes and business needs. Joint face-to-face training activities and predefined learning activities within the corporate learning environment supplement learning at the workplace. Obtained learning outcomes are verified by means of tests and monitoring of working processes. After the completion of a training measure, the obtained learning outcomes are stored in learners’ portable personal achieved learning outcome profiles.

Scenario 3: Professional development

Melissa is a motivated young computer professional working as a programmer in a big software company. Already as a teenager she dreamed of becoming an entrepreneur in the area of computer games and founding her own company before the age of 30. The computer science programme Melissa attended at a local university lacked many learning outcomes Melissa would be required to attain in order to pursue her goal, especially in management and finance. Melissa is also required to regularly update her computer science knowledge and skills to be able to follow the rapid scientific and technological developments in her industry. In an attempt to organize an individual learning path, Melissa first needs to analyse her knowledge, skill and competence gaps and clearly define learning goals in terms of intended learning outcomes. A free on-line service at a career development agency helps her analyse those gaps by automatic semantic matching of her achieved learning outcomes against the competency data the agency has for different occupations and positions. Missing learning outcomes are represented in Melissa's profile as her learning needs. Based on the identified learning needs, she decides that her first goal is to obtain competencies in project management and leadership. Since her studies at the local university, Melissa has been maintaining her personal learning environment (PLE), which supports self-directed learning and collaboration. The PLE’s tools and services enable Melissa to find learning opportunities that best suit her intended learning outcomes, at different educational institutions as well as from other users. The tools and services also keep notifying Melissa about changes and upgrades to the computer science programme at her alma mater and suggest topics she should learn to stay in touch with the latest developments. Her obtained learning outcomes are taken into account during the selection of the best-suited learning opportunities.
As Melissa does not speak languages other than English and French, the search tool filters learning opportunities in those two languages when presenting search results. From the list of learning opportunities Melissa selects a blended course on project management that also enables her to obtain competencies in leadership. The course is provided by a local business school. Melissa uses a variety of social software tools from her PLE when interacting with learning facilitators, such as her teacher and his assistant, as well as with other learners. For the final assessment in the course she has to prepare a small project and lead a group of peer students who will help her implement the project. Melissa obtains knowledge, skills and competencies from formal learning opportunities, found and recommended by her PLE, as well as from informal learning activities, e.g. by active participation in an open-source games-developing community. The learning outcomes she obtains in the context of formal learning opportunities are usually also formally assessed by the institutions providing the learning opportunity. On the other hand, most of the user-generated learning opportunities she finds contain self-assessment methods and resources for assessing the outcomes. In order to obtain formal certificates for some of those (informally obtained) outcomes she takes assessment tests at local higher education institutions, lifelong learning centres, and vocational training institutions. Achieved learning outcomes are stored in her learning outcome profile. Melissa’s learning outcomes can thus be proved by means of official evidence records obtained from different educational institutions and also verified on the basis of achievements collected in her ePortfolio.

Scenario 4: Re-skilling

The recent economic crisis affected almost all European regions and economic sectors. In the period from September 2007 to September 2009 the overall unemployment rate in the EU-27 rose from 7.1% to 9.2%, with extreme cases in Ireland (from 4.6% to 13%), Latvia (from around 6% to 19.7%) and Spain (from 8.6% to 19.3%). On the other hand, there still exist occupations where demand for qualified workers exceeds supply, or where the people holding job positions are not sufficiently qualified for the job, for example in natural sciences and technology teaching. One of the government’s attempts to reduce the unemployment rate is a newly established community programme that motivates people from particular sectors, e.g. the financial sector, to be re-skilled and take up another occupation. Maria worked as a junior financial analyst in a large investment bank that recently went bankrupt. Since she has not been able to find a new job in the financial sector for the past six months, she decided to take the opportunity the government programme offers and obtain the few missing qualifications (mostly pedagogy-related) for a mathematics teaching position. The programme supports her with a social community, backed by a web portal where people interested in the programme can meet and learn together, a list of required learning outcomes, an overview of various accredited educational providers and learning opportunities, and a financial subsidy to cover her education costs. Maria goes to the web portal to choose a set of courses that allows her to obtain all missing learning outcomes while not exceeding the funds she was granted by the government. Based on the learning outcomes she wants to obtain, the portal suggests blended courses and groupings with other learners according to their background and achieved learning outcomes. Maria chooses a few courses from the suggested list, enrols in the courses and learns together with other peer learners.
Assessment of the obtained learning outcomes is done by the educational providers in the context of the courses. Assessment records are taken as evidence of Maria's achievements and stored in Maria's personal achieved learning outcomes profile. The obtained learning outcomes enable Maria to obtain the national qualification for the selected occupation.

3.2 Concept Model

Several key concepts of the IRM have already been introduced in the sections above. In this section we present a consolidated summary of these key concepts. Beyond describing the IRM’s domain, the ICOPER Concept Model also serves as an introduction to the ICOPER Reference Model as such, since many of the key concepts are re-used in the IRM Data Schemas and the other parts of the IRM. The ICOPER Concept Model thereby constitutes a gluing component between the various parts of the IRM. The ICOPER Concept Model as published in this document consists of

• a publicly-shared visualization in a concept map1
• concept definitions including synonyms

The key concepts are illustrated in a concept map (see Figure 4). The ICOPER Concept Model applies the following design principles:

1. Concepts contradict neither common practice nor scientifically grounded definitions.
2. Concept titles are short and precise.
3. Concept titles shall not be misleading and shall avoid confusion (e.g. stress the difference between assessment and evaluation by adding the feedback focus to the concept, as in learner assessment or learning process evaluation).
4. Concept titles stay accurate throughout the life cycle of a concept (e.g. avoid titles such as _intended_ learning outcome).
5. Concept definitions re-use other concepts when possible and introduce synonyms. Synonyms are defined as words with identical or very similar meaning.
6. The ICOPER Concept Model is based on widely adopted concepts while at the same time supporting ICOPER's vision of introducing "outcome-oriented teaching and learning" in the European Higher Education Area.
7. The ICOPER Concept Model is easy to read. Therefore the concept model focuses on important key concepts only. Various ICOPER Work Packages can provide additional concepts in work-package-specific concept models.
8. The ICOPER Concept Model is independent of any implementation paradigm (e.g. Object-Relational Implementation, Description Logic). Therefore a simple visualisation syntax is used, based on concepts and directed, labelled concept relationships.
9. The ICOPER Concept Model is derived from ongoing work in the various ICOPER work packages.
10. Stakeholders of the various parts of the IRM, and in particular the ICOPER work package leaders, can agree on the published version of the ICOPER Concept Model.

These design principles refer to both the documentation and the development process of the ICOPER Concept Model. The outcome of this development process is documented in the following table.

Table 3. IRM Concepts including Definitions

1  http://www.icoper.org:8080/rid=1H50YCXCC‐1T39P24‐1ST/ICOPER%20Reference%20Model%20%28IRM%29  

1.1 Shareable Educational Resource
    Definition: A Shareable Educational Resource is an addressable object in a Repository that is relevant in the context of learning and teaching. It is described via metadata and identifiable through an identifier.
    Synonym(s): Learning Object

1.2 Repository
    Definition: A Repository is a managed directory for Shareable Educational Resources.
    Synonym(s): Pool, Collection

1.3 Referatory
    Definition: A Repository that only contains metadata and not the resources themselves is called a Referatory.

1.4 Metadata
    Definition: Metadata is data about data.

2.1 Personal Achievement Profile
    Definition: A Personal Achievement Profile is a collection of a Learner’s Achievements.

2.2 Achievement
    Definition: Achievement refers to a potentially individualized description of an attained Learning Outcome. An Achievement can also be evidenced by an Assessment Record.

2.3 Learning Outcome
    Definition: Learning Outcome refers to statements of what a learner knows, understands and is able to do on completion of a Learning Opportunity.
    Reference(s): European Commission 2008, p. 11

2.4 Context
    Definition: Context is the set of educational elements that is external to and adds meaning to an achieved learning outcome or Learning Opportunity. An example of an educational context of a course is its study programme.
    Synonym(s): Educational Context
    Reference(s): Azouaou & Desmoulins 2006

3.1 Teaching Method
    Definition: A Teaching Method is a learning-outcome-oriented set of activities to be performed by Learners and Learning Facilitators (Supporters). Examples of teaching methods are the lecture method, problem-based learning, and the think-pair-share method. Teaching methods are described using a teaching method description template. Typically, teaching methods are generic descriptions of activities, independent of specific content or an application context. Teaching methods are realized in units of learning within a specific context and with associated content.

3.2 Learning Design
    Definition: A Learning Design is a reusable representation of a concrete Learning Opportunity. A Learning Design arranges Teaching Methods, Assessment Methods, Learning Content and Learning Tools towards Learning Outcome attainment.
    Synonym(s): Learning Opportunity Design, Unit-of-Learning at Design Time

3.3 Learner
    Definition: A Learner is a person who performs learning activities in the context of a Learning Opportunity to attain intended Learning Outcomes. Learner is also a high-level role that can be specified by a specific Teaching Method with various concrete roles (e.g. in the Jigsaw Teaching Method learners assume the role of experts, presenters, and so forth).

3.4 Learning Facilitator
    Definition: A Learning Facilitator is a person who supports the Learner during the activities carried out in a Learning Opportunity. Learning Facilitator is also a high-level role that can be specified by a specific Teaching Method with various concrete roles. Typical learning support roles are teacher, instructor, facilitator, external expert, moderator, etc.

4.1 Learning Content
    Definition: Learning Content refers to any digital and non-digital material that can be used in a Learning Opportunity. Examples of Learning Content are Web pages, lecture slides, a textbook, SCORM-compliant Web-based trainings, etc.
    Synonym(s): Learning Object

4.2 License
    Definition: A License defines terms and conditions for re-using content for a particular target audience. An example of a license is the Attribution-Non-commercial-No-Derivative Works 3.0 Austria version of the Creative Commons License Framework.

4.3 Rights Holder
    Definition: A Rights Holder is an identified legal entity, including contact details, holding rights to content.

5.1 Learning Opportunity
    Definition: A Learning Opportunity refers to a contextualized, complete, self-contained unit of education or training that implements a specific Learning Design in a particular physical or virtual location. Examples of Learning Opportunities are web-based learning modules, face-to-face courses, instantiations of study programmes, etc.
    Synonym(s): Learning Opportunity Instance
    Reference(s): Adapted from Olivier & Tattersall 2005, p. 25

5.2 Learning Tool
    Definition: Learning Tools are deployed technologies that are used in, and in this way facilitate the support of, a Learning Opportunity. Examples of learning tools are a blog, chat, forum, wiki, etc.
    Synonym(s): Collaboration Service

6.1 Assessment Resource
    Definition: An Assessment Resource is a special type of Learning Content used for the assessment of a learner’s learning activities, thus stimulating some kind of interaction or reaction by the learner.

6.2 Assessment Method
    Definition: An Assessment Method is the way of deploying the assessment process, formalised into a set of specifications that fully characterise the process along its dimensions.

6.3 Learner Assessment
    Definition: Learner Assessment describes the process of testing the Learning Outcomes attained by a Learner and providing the corresponding information reporting on the Learner’s achievements or potential indications for improving them.

6.4 Assessment Record
    Definition: An Assessment Record captures evidence that a Learner has attained a specific Learning Outcome.
    Synonym(s): Grade Record

6.5 Assessment / Evaluation Design
    Definition: Specifies the type of evaluation (e.g. formative vs. summative) and the instruments (e.g. questionnaires) used.

6.6 Learning Process Evaluation
    Definition: A Learning Process Evaluation describes the process of collecting feedback on a Learning Opportunity, either for a learner’s self-reflection (formative evaluation) or for measuring its success (summative evaluation).
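As a small illustration of concepts 1.3 and 1.4 above (this is not an IEEE LOM record; the field names are invented for the sketch), metadata is simply a record describing another resource, and a Referatory holds only such records:

```python
# Hypothetical, simplified metadata record: data describing other data.
# The resource itself would live in a Repository; the Referatory holds
# only the metadata and an identifier pointing at the resource.
resource = {"id": "ser-42", "content": "<lecture slides>"}

metadata = {
    "describes": "ser-42",          # identifier of the described resource
    "title": "Introduction to Learning Technologies",
    "language": "en",
    "license": "CC BY-NC-ND 3.0 AT",
}

def find_by_title(records, title):
    """A referatory-style lookup: search the metadata, not the resources,
    and return identifiers of matching resources."""
    return [m["describes"] for m in records if m["title"] == title]
```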

Figure 4. IRM Concept Model

4 Process Models

4.1 The Meta Model of IRM Process Models

The IRM contains a description of key process areas for the development, use, and improvement of outcome-based learning in a Higher Education context with Open Educational Resources (OER). The ISO 9001:2000 definition of a process is a “set of interrelated or interacting activities which transforms inputs into outputs”. ISO also points to the difference between a process, which describes activities, and a specific, normative procedure, which is used e.g. for defining software application functionalities. For HE stakeholder activity, we take the main process areas, to which belong the processes and sub-processes significant for TEL developments. For the prototypes developed, procedure-level descriptions are given. Thus the stakeholder process areas illustrate the relevant activity supported by the services provisioned by the prototype implementations, including the OICS environment. This also means that not all HEI processes are covered. Combining the common structure for models (see 4.1.1), the main process areas (4.1.2), the primary processes (4.2), and the prototype examples (4.3), a common frame of reference emerges that illustrates which activity a prototype will support, and allows studying what possible benefit such an application, and the content and metadata standards used by the application and the repository, will bring to the stakeholders.

4.1.1 Process modeling
The Object Management Group’s (OMG) Business Process Modeling Notation (BPMN) V2.0 beta has been used for graphical process modelling. Sophisticated tools can transform between the different formats of process information. The BPMN meta-model is versatile, enabling the representation of process structures in models at varying levels of aggregation. BPMN can also provide notation in greater detail than is used in the IRM. BPMN enables automated mappings to the XML-based Business Process Execution Language. Process identification and process data for performance metrics etc. can be held in a so-called process card, that is, a table or chart capturing key information and an ID of a process. For this, the ISO/IEC 19796-1:2005 standard provides a template. Process information is collected for process metrics in initiatives of process improvement and optimization. Sophisticated process tools display information both as flow diagrams and as cards, and in the further IRM development we aim to provide such a consistent account of the processes. Figure 5 presents the basic elements of the BPMN modelling language for simple process flow diagrams. A Start event triggers the process to begin; a Sequence flow takes the process from one phase to another. A rounded rectangle marks an Activity, Task or Sub-process, and a dotted arrow marks a message being sent between two points. A Gateway is a point where the process flow can diverge, and the process ending is marked with an End event. Events can be marked with symbols for e.g. messaging or links. The process can be presented in a Pool that marks the organization, with separate Lanes for different roles or actors participating in the process at different phases. A Data element represents items of processed information. All these elements have element data specifications, and the models can be refined with further elements. For more detailed process information, please refer to the BPMN specification document.
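To make the element types concrete, a toy process flow could be represented as follows. This is a plain data sketch for illustration only, not executable BPMN and not tied to any BPMN tooling; the node names are invented:

```python
# A minimal, hypothetical representation of a BPMN-like process:
# events, activities and a gateway connected by sequence flows.
process = {
    "nodes": {
        "start":  {"type": "startEvent"},
        "design": {"type": "activity", "name": "Create Learning Design"},
        "check":  {"type": "gateway",  "name": "Accredited?"},
        "offer":  {"type": "activity", "name": "Offer Learning Opportunity"},
        "end":    {"type": "endEvent"},
    },
    # Sequence flows as (source, target) pairs; a gateway may have
    # several outgoing flows, reflecting the diverging process paths.
    "flows": [
        ("start", "design"), ("design", "check"),
        ("check", "offer"), ("check", "end"), ("offer", "end"),
    ],
}

def successors(proc, node):
    """Follow the sequence flows out of a node."""
    return [dst for src, dst in proc["flows"] if src == node]
```

Here the gateway "check" has two outgoing flows, which is exactly the diverging behaviour the prose above attributes to Gateway elements.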


[Figure 5 legend:
- Start event, Intermediate event, End event
- Sequence flow, Message flow
- Gateway: marks a decision point in the process, where the process can diverge into different flow directions; different types carry different symbols (x, +, *), with decision (x) as the default
- Activity or sub-process: a + sign at the middle of the bottom line marks the sub-process as 'collapsed', and the process within can be displayed in another diagram; an arrow sign in the same position marks the sub-process as 'recurrent'
- Pool: the teaching-learning organization, with multiple roles/actors; for each Role or Actor in the process there is a Lane, and all activities or sub-processes carried out by that actor are displayed within this lane
- Data: an item of data, e.g. a Shareable Learning Resource or Learner Data
- An envelope symbol added to an event (start, intermediate or end event) marks it as a message event: the event sends a message indicated by the message flow; a block arrow symbol in an event marks a link.]

Figure 5. BPMN Basic Shapes

The ICOPER Reference Model processes are derived from two directions. The first, top-down approach starts from the IRM Context Scenarios described in Section 3.1, which take the Learner view and the views of the other learning stakeholders on the activities in the domain. The second, bottom-up approach was taken through the work on the Open ICOPER Content Space (OICS) and the numerous prototypes of applications and services for learning support that were developed around the OICS in the project. To provide a common platform for the top-down scenario processes as well as the bottom-up application processes, the Pool element of the BPMN language is used. We define the IRM Pool, which consists of a Lane element for each Actor recognized in the domain


conceptual model. The main human actors are the Learner (a.k.a. Student, Employee) and the Learning Facilitator (a.k.a. Teacher, Tutor, Assistant, Learning Supporter), whose lanes are presented as the top lanes. Other possible actors in different learning settings are administrative roles (e.g. HEI Administration, Company HR Department), facilitative roles (e.g. Content Provider, Education Provider) and the technological systems and repositories (e.g. the OICS, a HEI administrative system, a company HR system, a Content Provider content repository) that store and process the assets needed (Learning Outcome Definition, Shareable Educational Resource, Assessment Resource, Learning Outcome Profile) and provide services to assist the human roles in their activities. Figure 6 displays the Pool structure. Extra lanes are added when external stakeholders participate in the process.

Figure 6. Pool Structure, Actor Names and Aliases

A Scenario Context can be represented as a set of lanes specific to the scenario, with the lanes of the actor aliases active in this scenario.

Figure 7. Scenario Pool Example: Re-skilling Scenario

4.1.2 Main Process Areas

The pool structure gives a frame for depicting the processes by area of responsibility. The concept of a process owner can be used to point out who manages the resources used to carry out the activities and who is for the most part responsible for the process outcome. There are, of course, multiple interactions between lanes, i.e. between processes and their owners, according to the interdependencies in carrying out the activities and producing the results. Nevertheless, the idea of a process owner can be used as a technique to group processes and activities logically into a meaningful structure.


The main process areas are activities commonly found in organizing higher education, depicted at a high abstraction level (“0-level”). Individual organizations follow varied processes and procedures, which is why these activity areas are presented in the IRM as process areas. These areas link the IRM-specific processes, described in the following sections, to the worlds of the stakeholders, and also link the IRM processes logically together. The 0-level areas give the starting points for associating the activities with the correct actor or group of actors (see Figure 8).

Figure 8. The IRM main process areas, 0-Level


4.2 Primary Processes

4.2.1 Learner Processes

At the next level, the level-one processes comprise the core activities of the actor in each lane. The Learner is involved in:

- Planning Learning4
- Learning (and assessment)
- Managing personal accounts (of learner data), where the Learner can manage, besides personal details, e.g. their achievement profile.

Management of the Learner profile (Personal Achieved Learning Outcomes, PALO) is relevant both at the beginning and at the completion of the learning process. In the beginning, profiles (both targeted and completed) are used for planning learning. This activity breaks down into the tasks of searching Learning Opportunities and gap analysis, which is conducted against the existing profile (PALO). At the completion of a learning process, the PALO instance is updated.
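The gap analysis step can be pictured as a comparison between targeted and achieved learning outcomes in the PALO profile. The following sketch is illustrative only; the profile structure and outcome identifiers are assumptions, not part of the IRM data schemas:

```python
# Illustrative sketch of the Learner's gap analysis against a PALO profile.
# The profile structure and outcome identifiers are hypothetical.

def gap_analysis(targeted_outcomes, achieved_outcomes):
    """Return the learning outcomes still to be attained."""
    return sorted(set(targeted_outcomes) - set(achieved_outcomes))

palo = {
    "achieved": ["LO-basic-statistics", "LO-academic-writing"],
    "targeted": ["LO-basic-statistics", "LO-academic-writing",
                 "LO-research-methods", "LO-project-management"],
}

gap = gap_analysis(palo["targeted"], palo["achieved"])
# The resulting gap drives the search for matching Learning Opportunities.
print(gap)  # → ['LO-project-management', 'LO-research-methods']
```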

[Figure 9 diagram labels: Planning Learning; Learning and Assessment; Managing personal accounts; Assessment process: Diagnostic Assessment, Formative Assessment, Summative Assessment, Self-assessment, Peer assessment, Post learner work for assessment; Manage Personal Achieved Learning Outcomes: Create, Update.]

Figure 9. Learner’s Processes

If the learning process is managed by a Higher Education Institution, the learner may not be the owner of processes like Planning Learning (including planning of curricula, teaching method selection and other related activities). However, if the learning process is not institution-managed, but the learner participates e.g. in professional development (cf. the Scenario example in Figures 5 and 6), part of the planning and management responsibility is transferred to the learner.


4 The processes could be identified with numbers: 1. Learner Processes, 1.1 Planning Learning etc., but this identification is not yet followed through in this document, since this IRM element is under rapid evolution. 


The Professional Development Scenario from Section 3.1, illustrated in the following figures, is divided into a first phase (Figure 10), where the Learner analyzes their current profile of achievements and the opportunities to develop their competencies, and then plans a learning effort. The actual learning process and the related assessment activities are depicted in the second and third phases shown in Figure 11.

Figure 10. Learner-centered Scenario

Figure 11 follows the scenario with a learner-led approach through the further phases of actual learning and achievement profile updating. In this scenario example, learning and assessment processes are depicted as interacting, separate recurrent activities. For an elaboration of the assessment area processes, see below. The model leaves open at which phases of the learning effort assessment takes place.


Figure 11. Learner-centered Scenario, Phases 2 and 3, Learning, Assessment and PALO Management

4.2.2 Learning Facilitator Processes

The main process areas of the Learning Facilitator (a.k.a. Teacher, Tutor, Assistant, Learning Supporter) are presented in Figure 12. Supporting learning (facilitating learning) is an iterative activity. It is also noteworthy that the Assessment process can recurrently interact with the Supporting Learning process, as illustrated in the later section on Assessment.

Figure 12. Learning Facilitator Process areas and processes


Whether and how much the Learning Facilitator participates in the Administration process area is very much organization-specific. Very often, however, in an institutional setting a teacher also manages the learner records of achievements and other learner data, which is why this area is included here. Planning learning breaks down into the processes of Content Selection, Method Selection and Tool Selection. The Learning Facilitator possibly also participates in the activities of Context Analysis and, as a joint activity with the administration, in Learner data management. The following example (Figure 13), from the University of Vienna prototype processes, shows the preparation of a Learning Design by enriching it with a Learning Outcome description as part of the Facilitator activities.

[Figure 13 diagram labels: Search and Retrieve Learning Design; Update Learning Outcomes of LD; Upload enriched LD to OICS; Enriched Learning Design. Sub-process Search and Retrieve UoL (expanded): Enter Search Query and Filter; Send Query; Browse Query Results; View LD Details; Select LD; Retrieve Learning Design. Sub-process Update Learning Outcomes of LD (expanded): Select existing Learning Outcome; Add new Learning Outcome; Edit existing Learning Outcome; Remove existing Learning Outcome.]

Figure 13. Sub-Processes of Learning Facilitator, case Preparing Learning Designs


There are many variations of the Learning Facilitator (Teacher, Assistant, Learning Supporter) activities. There may be a different share of planning activities, or none at all. The actual tasks in supporting learning may vary, and the assessment may take many forms (see the Assessment process area below).

4.2.3 Institution-Led Processes

The process areas related to HEI management and administration activities, with their sub-areas (see Figure 14), depict activities within an institution-managed learning scenario. Planning may be taken over to a great extent by the institution.

Figure 14. Institutional Processes (Strategy, Management, Administration)

4.2.4 Assessment Processes

The processes are grouped according to the main actor carrying out the process. Assessment processes are of special interest here, firstly because it is through assessment that learning outcomes are validated. Secondly, with evolving new approaches to pedagogy, such as problem-based learning or project learning, assessment may take varied forms diverging from question-answer based assessment methods. The assessment process is always carried out by the learner but, if the outcomes are to be verified, also by another party, e.g. the Learning Facilitator. In some cases, assessment can be carried out by a system, if automated tests are made available and Learner authentication to the system is strong enough to connect the outcome to the individual. Assessment activities may take place sequentially, as pre-, inter- and post-study assessment (diagnostic, formative, summative assessment, cf. the FREMA framework).


There are, however, other types of assessment than test- or examination-based (question-answer) ones (Dochy, 1997; Yorke, 2001). In such cases, the assessment is based e.g. on learner-created content (learner exercise reviews, performance observation, collecting a student work portfolio, etc.). Peer and self-assessment may also be part of a teaching method. In the following figure, the assessment process has variable tasks that take the varied assessment procedures into account. The actors involved here are Learner and Learning Facilitator.

Figure 15. Assessment Process Activities

4.2.5 Content Provider Processes

A Content Provider participates in the activities of:

- Planning Content
- Preparing Content
- Publishing Content.

These will be further elaborated in the coming work on IRM.

4.3 Prototype Process Examples

The following pages show examples of technology-supported processes, taking flow diagrams from IRM prototype implementations (see Table 2). NB! These models were developed by different teams and before the domain model harmonization, which is why alias names of entities are in use on several occasions in these models. Since the flow diagrams describe a process executed by an application, they are partly at the procedure level of detail.


4.3.1 Learning Facilitator Process Support: Design a Course (WUW Prototype)


4.3.2 Learning Facilitator Process Support: Add Learning Outcomes to Lesson (WUW Prototype)


4.3.3 Import Learning Designs in Elgg, Get Recommendations

[Figure: lanes for Facilitator, Learner and the elgg VLE. Tasks: search UoL; import UoL (New Course / Existing Course); invite students; execute UoL; support learner; read recommendations; view LOP.]


4.3.4 Learning Design use sub-processes (Giunti prototype)


4.3.5 Assessment: Assessment Planning, Assessment Activities

[Figure: Learning Assessment Activities: Visualise Assessment; Answer Assessment; Submit Assessment Response; Visualise Assessment Results; Delivery; Appraise; Generate Assessment Results.]


4.3.6 Publish Personal Achieved Learning Outcome in a third party environment (IMC Prototype)

[Figure: flow diagram in the IRM Pool, with lanes for Learning Facilitator, Learner, and Repository (OICS).]


4.3.7 Example of services rendered by OICS for learning processes (UMU Prototype)


[Figure: widget-based recommendation flow. Lanes: Teacher; System: Moodle (with Wookie plugin); Wookie Widget Server; OICS User Service; OICS Recommendation Service; OICS LOP Service. Tasks: Teacher recommendations (widget); View recommendations (teachers with similar learning outcomes); Account at OICS? (No: Register at OICS, Create user account; Yes: Enter username and password for the account at OICS); Widget instance exists? (No: a widget instance is created for the user; Yes: Get widget instance); Get recommendations; Get the user's LOUs (used as an argument to the Recommendation service).]

4.3.8 Recommendation service: Example 2 of services rendered by OICS for learning processes


5 Service Descriptions

5.1 Design Principles for Service Descriptions

The IRM is based on a service-oriented architecture (see Section 2). According to SOA principles (SOA Systems Inc. 2009b), the following design principles guide the service-orientation paradigm:

• Standardized Service Contracts: When designing a service’s public interface, a contract guarantees that endpoints are consistent, reliable and governable. For the IRM, we specified these contracts in the form of the Middle Layer API described below.

• Service Loose Coupling: A service contract should not be coupled to a single implementation or to specific types of service consumers. In the IRM development we contribute to this objective through the implementation of a wide range of prototypes.

• Service Abstraction: This principle emphasizes the need to hide as much of the underlying detail of a service as possible. In the Middle Layer API, we honored this principle by defining functions that return only the required attributes.

• Service Reusability: Services have to be defined independently of specific concerns (e.g. context of use) in order to be useful to multiple service consumers. In aligning the Middle Layer API with other parts of the IRM, we achieved reusability by taking into account all aspects of the domain of outcome-based learning.

• Service Autonomy: Services should have control over the logic they encapsulate in order to control their reliability. In the Middle Layer API this principle was implemented by reducing dependencies between services by design.

• Service Statelessness: A stateless service does not retain state between consecutive invocations and is therefore able to scale better. The Middle Layer API is designed to be stateless.

• Service Discoverability: Services are meant to be described and published in a way that allows multiple implementations to compete and service consumers to discover them and bind to them dynamically. At this stage, service discoverability is not part of the IRM.

• Service Composability: Services should be easily combined into more complex services. We explore this principle through the more complex processes requiring multiple services.
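As an illustrative sketch (not part of the IRM specification), a standardized, stateless service contract can be expressed as an interface that is decoupled from any concrete implementation; all names below are hypothetical:

```python
# Hedged sketch: a standardized, stateless service contract decoupled from
# its implementation, in the spirit of the SOA principles above.
from typing import Protocol


class SearchService(Protocol):
    """Contract: consumers depend on this interface, not on a backend."""

    def get_teaching_methods(self, query: str = "") -> list[tuple[str, str]]:
        """Return (title, identifier) tuples; no state kept between calls."""
        ...


class InMemorySearchService:
    """One possible implementation; another can be swapped in freely."""

    def __init__(self, methods: list[tuple[str, str]]):
        self._methods = methods

    def get_teaching_methods(self, query: str = "") -> list[tuple[str, str]]:
        # Simple substring filter over titles, case-insensitive.
        return [(t, i) for t, i in self._methods if query.lower() in t.lower()]


svc: SearchService = InMemorySearchService(
    [("Problem-based learning", "tm-001"), ("Lecture", "tm-002")]
)
print(svc.get_teaching_methods("lect"))  # → [('Lecture', 'tm-002')]
```

Because consumers are typed against the contract rather than the class, loose coupling and abstraction hold even when the backend changes.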

5.2 Service Types

Based on the prototype developments driven by the Open ICOPER Content Space (OICS) as described in D1.2, we introduced the following service types.

Search and Retrieval Service
These services provide access to lists of Shareable Educational Resources of specific types (e.g. via getTeachingMethods one receives a list of Teaching Methods) and support filtering based on simple query expressions. List items provide at least a title and an identifier. The getMetadata method returns the metadata record for an object.

Publication Service


These services permit the publication of new resources into a collection stored in a Repository or Referatory. The service publishes a service document that describes, for each collection, the types of objects it accepts.

User Management and Learning Outcome Profile Service
These provide basic authentication, authorisation and identity management services by granting access to a user, permitting the grouping of users into groups, and manipulating learning outcome profiles.

Recommendation Service
These services list users whose learning outcome profiles contain a certain learning outcome. More sophisticated algorithms based on additional information can be exposed by this service.

Harvesting Service
These services allow content providers to selectively export metadata records while allowing consumers of the service to build value-added services based on this metadata. The OICS harvests these metadata descriptions and makes them available for outcome-oriented learning services. The harvester application used in the OICS is based on OAI-PMH (Open Archives Initiative) and makes use of the registry service in order to find repositories eligible for inclusion into the OICS, and of the validation service in order to make sure that ingested resources comply with the ICOPER LOM application profiles.

Registry Service
The OICS reuses the ARIADNE registry service that is currently being developed in the context of the ASPECT project. As explained in (ASPECT D2.2), this service provides a catalogue of up-to-date information about learning object repositories (LORs). It provides the information necessary for systems to select the appropriate protocols, such as OAI-PMH, SQI, SPI or SRU/SRW, supported by a given learning object repository. The OICS makes use of this registry to find information on, for example, characteristics of the repositories involved, such as contact persons, names, URLs, etc. In addition, the OICS also learns about the characteristics of the content, the metadata schemes the repositories use to describe their contents and associated rights, as well as the protocols, standards and specifications they offer to provide access to the content.

Validation Service
To ensure that only compliant metadata are stored in the OICS, we use the ARIADNE validation service to check both the syntactic and semantic validity of the instances against the data schemas in use.

Identification Service
All resources managed by the OICS are assigned a unique, persistent identifier. This service creates an identifier, which retains any information necessary for resolving the identifier to an OICS resource.
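As an illustration only (the IRM does not prescribe an identifier scheme here), an identification service of this kind can be modelled as a pair of mint/resolve operations; the class, method names and namespace below are hypothetical:

```python
# Hypothetical sketch of an identification service: mints persistent
# identifiers and resolves them back to OICS resource locations.
import uuid


class IdentifierService:
    def __init__(self, namespace: str = "oics"):
        self.namespace = namespace
        self._registry: dict[str, str] = {}  # identifier -> resource location

    def mint(self, resource_location: str) -> str:
        """Create a unique, persistent identifier for a resource."""
        identifier = f"{self.namespace}:{uuid.uuid4()}"
        self._registry[identifier] = resource_location
        return identifier

    def resolve(self, identifier: str) -> str:
        """Return the resource location the identifier points to."""
        return self._registry[identifier]


ids = IdentifierService()
pid = ids.mint("http://example.org/resources/42")
assert ids.resolve(pid) == "http://example.org/resources/42"
```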


5.3 The OICS Middle Layer API

The OICS middle layer bundles the requirements of ICOPER together in a coherent API that is accessible from all systems and tools, not only for the use of the project consortium but also for the broader community of interested parties. It serves as a proof of concept to enable validation of the work done by the different WP and prototype teams during the project. The OICS Middle Layer API will evolve further until the end of the project. Some of the method labels still need to be synchronized with the latest version of the IRM Concept Map (e.g. UnitsOfLearning is used instead of Learning Design).

User Management and Learning Outcome Profile Service

Name listUsers

Description Lists all users registered at the OICS - might be restricted to system admins for privacy protection.

Input parameters

Output List of users with names and pointer to detailed information

Name createUser

Description Creates a new OICS user. OICS users can publish annotations to OICS objects and store learning outcome profiles.

Input parameters Email, first_names, last_name, password

Output userId (string)

Name getUser

Description Returns all information about the user that is available for an authenticated and authorized user.

Input parameters userId

Output List of property/value pairs; values can be pointers to a representation of the property (for example profile)

Name modifyUser


Description Modifies a property of the user.

Input parameters

property name, value. The following properties should be supported: email, first_names, last_name, email_privacy_level (if set to 1, the user's email is shown to other users)

Output void

Name createGroup

Description Creates a group representing a learning context with users’ assigned roles (teacher and learner).

Input parameters GroupName

Output groupId (String)

Name addUserToGroup

Description Adds a user to a group with a given role (teacher or learner).

Input parameters userId,groupId,Role

Output void

Name getGroup

Description Lists all members of a group with their respective roles.

Input parameters groupId

Output List of tuples (userId,role)

Name addAchievementToLearningOutcomeProfile

Description Adds achievement information to a user's learning outcome profile (the achievement is linked to a learning outcome, an assessment record and a context)

Input parameters

userId, identifier (optional), title (optional), description (recommended), related learning outcome (recommended), context (recommended), assessment record (recommended). Note: If no identifier is submitted, it will be generated. The title and description of an achievement may be similar to the title and description of the attained learning outcome of the achievement

Output identifier

Name addAssessmentRecordToLearningOutcomeProfile

Description Adds information on an assessment record that serves as evidence of an achievement (the assessment record is linked to an achievement)

Input parameters

userId, identifier (optional), type (mandatory), title (mandatory), score (optional), date (mandatory), assessingBody (mandatory), description (recommended), attachedReference (optional), levelOfAssessmentResult (scheme, value) (optional). Note: If no identifier is submitted, it will be generated.

Output Identifier

Name getLearningOutcomeProfile

Description Retrieves the learning outcome profile for a user.

Input parameters userId

Output String
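The user and profile methods above can be exercised together. The following in-memory sketch illustrates only the call sequence; it is not the OICS implementation, and the snake_case names and simplified identifiers are assumptions:

```python
# In-memory sketch of the User Management and Learning Outcome Profile
# service, illustrating the call sequence only; not the OICS implementation.
import itertools


class LopService:
    def __init__(self):
        self._users: dict[str, dict] = {}
        self._ids = itertools.count(1)

    def create_user(self, email, first_names, last_name, password) -> str:
        # Real implementations would store and protect all user details.
        user_id = f"u{next(self._ids)}"
        self._users[user_id] = {"email": email, "achievements": []}
        return user_id

    def add_achievement_to_learning_outcome_profile(self, user_id, title) -> str:
        achievement_id = f"a{next(self._ids)}"
        self._users[user_id]["achievements"].append(
            {"id": achievement_id, "title": title})
        return achievement_id

    def get_learning_outcome_profile(self, user_id) -> list[str]:
        return [a["title"] for a in self._users[user_id]["achievements"]]


svc = LopService()
uid = svc.create_user("jane@example.org", "Jane", "Doe", "secret")
svc.add_achievement_to_learning_outcome_profile(uid, "Basic statistics")
print(svc.get_learning_outcome_profile(uid))  # → ['Basic statistics']
```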

Recommendation service


Name recommendUsers

Description Looks for users whose learning outcome profiles contain all or any of the provided learning outcomes, using matching criteria.

Input parameters

list of objectIdentifiers for learningOutcomeDefinitions; comparisonOperator: (optional) AND (default) or OR

Output List of userIds
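The AND/OR matching behaviour of recommendUsers can be pictured with a small sketch over in-memory profiles. This is illustrative only; the actual OICS matching algorithm is not specified here:

```python
# Sketch of recommendUsers-style matching: find users whose learning
# outcome profiles contain all (AND) or any (OR) of the given outcomes.
profiles = {
    "u1": {"LO-statistics", "LO-writing"},
    "u2": {"LO-statistics"},
    "u3": {"LO-didactics"},
}

def recommend_users(outcome_ids, comparison="AND"):
    wanted = set(outcome_ids)
    if comparison == "AND":
        match = lambda profile: wanted <= profile       # all outcomes present
    else:
        match = lambda profile: bool(wanted & profile)  # any outcome present
    return sorted(uid for uid, profile in profiles.items() if match(profile))

print(recommend_users(["LO-statistics", "LO-writing"], "AND"))  # → ['u1']
print(recommend_users(["LO-statistics", "LO-writing"], "OR"))   # → ['u1', 'u2']
```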

Search and retrieval service

Filters specify the field in the metadata schema that a query should be applied to. Multiple queries can be combined. These methods return only minimal information about an object; the complete description can be retrieved by calling getMetadata.

Name getLearningOutcomeDefinitions

Description This method returns a list of learning outcome definitions stored in the OICS. Title and identifier are provided for each object.

Input parameters

filter: (optional) query: (optional) pageNumber: (optional)

Output List of tuples (title and identifier)

Name getTeachingMethods

Description This method returns a complete list of all teaching methods stored in the OICS.

Input filter: (optional) query : (optional) pageNumber: (optional)

Output List of tuples (title and identifier)

Name getUnitsOfLearning

Description This method returns a complete list of all Units of Learning stored in the OICS.

Input parameters

filter: (optional) query: (optional) pageNumber: (optional)


Output List of tuples (title and identifier)

Name getLearningResources

Description This method returns a complete list of all resources stored in the OICS.

Input parameters

filter: (optional) query: (optional) pageNumber: (optional)

Output List of tuples (title and identifier)

Name getLearningAssements

Description This method returns a complete list of all assessment resources stored in the OICS.

Input parameters

filter: (optional) query: (optional) pageNumber: (optional)

Output List of tuples (title and identifier)

Name getMetadata

Description

This method returns the metadata record for an object stored in the OICS. The method returns the "default", "objective" metadata record for an object. If multiple metadata records for an object are stored in the OICS, an optional metadata identifier can be provided as input argument.

Input parameters

objectIdentifier metadataIdentifier: (optional)

Output String

Publication service

Name submitObject

Description Adds a new object to the OICS (learning design, teaching method, learning resource, learning outcome definition), or updates an existing one.

Input parameters

resource: a binary representation of the object; collection: (optional) identifier of the collection where the object should be added; objectIdentifier: (optional) if no objectIdentifier is provided, this service generates one (or extracts one from the binary representation of the object) and returns it. If an objectIdentifier is provided and it already exists in the system, the object will be updated.

Output objectIdentifier

Name submitMetadata

Description

Submits a metadata record. The status of the record depends on the configuration of the service and of the collection, and on the privileges of the user: in some contexts the record might be the "objective" metadata record provided by an accredited source, while in others it might be contributed by a user or by a community.

Input parameters

objectIdentifier: (optional) if no objectIdentifier is provided, this service generates one (or extracts one from the metadata record). If an objectIdentifier is provided and it already exists in the system, the metadata record will be associated with this object; if the object already has metadata associated with it, that metadata will be replaced. metadataRecord: String

Output objectIdentifier

Name submitEnrichment

Description

Enrichment here refers to providing additional information to an existing metadata record. If no record exists in a given context, a new empty one can be instantiated by the service. The metadata fragment should be wrapped in a LOM container, but special input formats for specific fields (for example annotations or relations) can be supported by implementations.

Input parameters

objectIdentifier metadataFragment: String

Output metadataIdentifier

Name submitComment

Description Submits a textual comment.

Input parameters

objectIdentifier Comment: String (can be of type html or xhtml)

Output objectIdentifier


Name submitRelation

Description Via this method, two objects can be linked and, if possible, the metadata records of both of them will be updated to reflect their relation.

Input parameters

identifier1
identifier2
linkType: String (must be a value from the relation.kind vocabulary of the ICOPER LOM profile)

Output void

Name deleteObject

Description Removes an object from the OICS.

Input parameters objectIdentifier

Output void
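To illustrate the intended semantics of the submit operations described above (identifier generation when none is supplied, replacement on resubmission, relation links drawn from the relation.kind vocabulary), the following in-memory sketch mimics their behaviour. The class and method names are illustrative assumptions; the IRM specifies the operations but no concrete binding or transport.

```python
import uuid

class OicsRepositoryStub:
    """In-memory stand-in for the OICS submit/delete operations.

    Illustrative only: the IRM defines submitMetadata, submitComment,
    submitRelation and deleteObject, but not this class or storage model.
    """

    def __init__(self):
        self.metadata = {}   # objectIdentifier -> metadata record
        self.comments = {}   # objectIdentifier -> list of comments
        self.relations = []  # (identifier1, identifier2, linkType)

    def submit_metadata(self, metadata_record, object_identifier=None):
        # If no objectIdentifier is provided, the service generates one;
        # if one is provided and already exists, the record is replaced.
        if object_identifier is None:
            object_identifier = "urn:uuid:" + str(uuid.uuid4())
        self.metadata[object_identifier] = metadata_record
        return object_identifier

    def submit_comment(self, object_identifier, comment):
        # Comments accumulate; they do not replace one another.
        self.comments.setdefault(object_identifier, []).append(comment)
        return object_identifier

    def submit_relation(self, identifier1, identifier2, link_type):
        # link_type must be a value from the relation.kind vocabulary
        # of the ICOPER LOM profile, e.g. "implements" or "is used by".
        self.relations.append((identifier1, identifier2, link_type))

    def delete_object(self, object_identifier):
        # Removes the object and its associated comments.
        self.metadata.pop(object_identifier, None)
        self.comments.pop(object_identifier, None)
```

Resubmitting metadata with an existing objectIdentifier replaces the stored record, matching the behaviour described for submitMetadata above.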


6 Data Schemas

Sharing common data via common data schemas is a widely used approach to achieving interoperability, and one that is widely adopted in the field of TEL. In ICOPER we have analysed existing approaches and synthesized them – together with the IRM Domain Model – into the first version of a common data model. The Data Schemas of the IRM aim at providing rich but simple data models for exchanging data between TEL-related systems in order to support outcome-oriented teaching and learning. The starting point for the IRM Data Schemas is the IRM Domain Model (see Section 2). We have developed initial data models for the key concepts based on good practice and existing learning technology standards and validated them through prototypes. The goal of this section is to present data schemas that have already been validated by proof-of-concept implementations (see Section 2). The following data schemas represent sub-domains of the IRM Domain Model.

6.1 Meta Model for Data Schemas

The IRM documentation of data models will:

1. Be concise and refer to detail given elsewhere
2. Use common conventions in presentation in order to be accessible
3. Be capable of representing data models that are adopted or profiled from various sources with differing styles, in a consistent format
4. Allow for existing and new models to be blended together
5. Not presume how the detail of specifications and standards should be documented (now or in the future)
6. Be easily understood by itself and in relation to the other parts of the IRM

Summaries are chosen for documentation as the best way of accommodating the above requirements. Two forms of presentation should be used: tabular and diagrammatic. These are expressly summary views and not specifications. Similarly, encodings (or "bindings") are a separate resource from the summaries. Separate summaries will be created for each of the building blocks of the IRM, assumed to align mostly with the IRM key concepts (see Figure 2). The building blocks will be further aligned to key service interactions in the IRM. This means that identical chunks of data model, representing commonly-used concepts, may be used in more than one summary. The summaries are representative of as-exchanged data structures. Standard table structures, showing some fragments of a data model, will be provided.


Table 4. Data Model Table Structure

Column | Note
Id | An identifier for the element. Each summary has its own value space for Id; use numbers 1-9. Hierarchy in the source specification should be shown using a format such as "1.2". Hierarchy not in the source specification may be shown in the interests of intelligibility, but the use of at most 2 levels of structure and pointers is preferable to deep hierarchies.
Name | A simple name, identical to that used in the source standard/specification or ICOPER deliverable, i.e. it matches an element in the document given in the "Reference" column.
Multiplicity | Limited to [5]: 0..1, 0..*, 1 or 1..*
Description | A brief description of the intended meaning of the element.
Reference | A reference to the source specification [6] and/or ICOPER deliverable where a full description is available (if this differs among elements in the summary).
Type | Generally blank except where the data type is simple, when a generic [7] data type is given: String, URI [8], Date/Time, Number, … Consider references to members of a controlled vocabulary to be a type named "Vocabulary Term" [9]. Consider pairings of a string and its language to be a type named "LangString".
Same As | A cross-reference to another IRM data model summary and specific element that is identical. This shows where commonly-used concepts are occurring in different places.
Notes | May include:
- An example or reference to a specific instance of use
- Suggested vocabulary (refer to the source spec or ICOPER deliverable)
- An indication that use is generally recommended even though the element is optional

[5] The choice of "0..*" vs "0..n" follows UML practice.
[6] It must be clear that the source specification is definitive.
[7] The way types are used and value spaces controlled is subject to quite a lot of variation in practice, so a "generic" approach is suggested, recognising that detail is omitted. For well-known structure types - e.g. "vcard" - they should be stated. Where it is considered important to give detail, try to do this as a clarification/specialisation/constraint on a generic type.
[8] NB: URI is considered a type.
[9] This may be equivalent to a URI, but existing practice does not always use URIs. This is glossing over considerations of abstract model and should be understood as a reference to an identified conceptual entity.

Where fragments of existing specifications are incorporated, or where existing specifications are heavily profiled, the above approach is viable. Where an existing specification is adopted in totality in the IRM, or where whole structures are imported into the IRM, duplication or reformatting is not appropriate and reference to the existing work should be preferred. In these cases the tabular presentation should be adapted to either:

1. (for adoption in totality) show only the top 1 or 2 levels of structure [10]
2. (for adoption of a structure) show only the top level of the adopted structure

In each case, no deeper sub-structure should be shown, but reference made to the original specification/standard.

For diagrammatic representations, UML class diagrams will be used with the following restrictions:

1. Use only the generic "association" relationship type and not the specialised "aggregation" and "composition"
   a. Generally provide a label for each association relationship, choosing names consistent with the ICOPER Conceptual Model when sensible
   b. Use directional associations to match the sense of the label
   c. Show multiplicities (limited to: 0..1, 0..*, 1 or 1..*) [11]
2. Show properties only, no methods
   a. With multiplicity
   b. With data type, consistent with the tabular presentation
   c. Omit properties that equate to association relationships
3. Show no visibility (public/private access) designation or, if this is forced by your modelling tool, show "public"
4. Avoid introducing specialisation relationships that are not present in the source specification
5. Avoid stereotype designations, ordering and other decorations
6. When incorporating structures from existing standards, adopt a similar strategy to that stated for tabular presentation and include a UML note to explain
7. Where a class appears in more than one data model (i.e. "same as" would be used in the tabular presentation), add a UML note to explain

6.2 Personal Achieved Learning Outcomes

Personal Achieved Learning Outcomes (PALO) summarizes data schemas for the following key concepts (see Figure 16 and Table 5):

- Learning Outcomes
- Achievement
- Assessment Record
- Personal Achievement Profile
- Context

[10] For non-hierarchical data models, view the "top" level as being those data model elements that are the bridge from the rest of the IRM into the adopted specification.
[11] UML tools differ in support, so alternatives are acceptable if necessary.


Figure 16. Personal Achieved Learning Outcomes – Diagrammatic Representation

Table 5. Personal Achieved Learning Outcomes – Table Representation

No | Name | Multiplicity | Description | Value Space | Data type | Obligation | Example

1 | Personal Achievement Profile | 0..n | This element represents a collection of one's personal achievements. | | | M |
1.1 | Holder | 1 | A human readable title of the personal profile. | | LangString | M | ("en", "Peter Smith's Learning Outcomes")
1.2 | Identifier | 1 | Primary URI that is used to access the profile. | URI | CharString | M | "http://www.wu.ac.at/students/psmith"
1.3 | Description | 0..1 | A human readable description of the learner profile. The Description may be repeated in multiple languages. | | LangString | O | ("en", "Learning outcomes obtained by Peter Smith at WUW")


1.4 | Achievements | 0..n | This element is a reference to the achievements represented in this profile. | | | O |
2 | Achievement | 0..n | An achievement record, normally of an attained learning outcome. Information about the achievement may be taken directly from a related learning outcome, rather than being given explicitly. Personalised versions of title and description may be used to supplement the learning outcome. | | | O |
2.1 | Identifier | 1 | A globally unique label that identifies the achievement in the individual profile. | URI | CharString | M | "http://www.wu.ac.at/students/psmith/achProf/ach1.A01"
2.2 | Title | 0..1 | A text label for the achievement. | | LangString | O | ("en", "project management")
2.3 | Description | 0..1 | A human readable description of the achievement. This is a personalised text of the learning outcome definition, but can also be similar to the learning outcome as defined by a taxonomy of learning outcomes. | | LangString | R | ("en", "Prepared and managed a software development project")
2.4 | Related Learning Outcome | 0..n | Identifier of the learning outcome that this achievement claims to have attained. | URI | CharString | O | "www.iCoper.org/LODTax/LOD_Id-1.A.1.a.2"
2.5 | Context | 0..n | Identifier of the context where the achievement is claimed to be attained. | | | O |
2.5.1 | Scheme | 1 | A reference to the definition or the schema used to describe the context values. | URI | CharString | M | "http://lre-thesaurus.eun.org/"
2.5.2 | Value | 1 | A label value/term of the context. | | Vocab. | M | "management"
2.6 | Assessment Record | 0..n | Identifier of the assessment record that stands as evidence for the achievement. | URI | CharString | O | "www.iCoper.org/assRec/Ass003"
3 | Learning Outcome | 0..n | The learning outcome (knowledge, skill or competence) that is attained by the learner. | | | O |
3.1 | Identifier | 1 | A globally unique label that identifies the learning outcome (knowledge, skill or competence). The Identifier is sufficient to reference the learning outcome definition in any other system. | URI | CharString | M | "www.iCoper.org/LODTax/LOD_Id-1.A.1.a.1"
3.2 | Title | 1 | A single mandatory text label for the learning outcome. This is a short human-readable name for the learning outcome. The Title may be repeated in multiple languages; each translation is represented by an instantiation of the LangString type. The Identifier provides the definitive reference to the learning outcome; the Title provides a convenient, alternative, readable form. | | LangString | M | ("en", "IT project proposals writing.") ("de", "Schreiben von IT-Projektanträgen.")


3.3 | Description | 0..1 | A human readable description of the learning outcome. This is an optional unstructured (opaque) "text blob" meant to be interpretable only by humans. The Description may be repeated in multiple languages. | | LangString | R | ("en", "Able to design an IT project proposal.")
3.4 | Type | 0..1 | An element that captures the type of learning outcome, according to the European Qualifications Framework (EQF). | knowledge, skill, competence | Vocab. | R | "skill"
3.5 | Related Learning Outcome | 0..n | Captures information about other learning outcomes that may be related to the currently described learning outcome. | | | O |
3.5.1 | Reference | 1 | A globally unique label that identifies the related learning outcome (knowledge, skill or competence). The Identifier is sufficient to reference the learning outcome definition in any other system. | URI | CharString | M | "www.iCoper.org/LODTax/LOD_Id-1.A.1.a.3"
3.5.2 | Relationship Type | 1 | This element captures the type of relation between the currently described learning outcome and another learning outcome. Examples are the SKOS relations (http://www.w3.org/TR/skos-reference/), like narrower and broader. | URI | CharString | M | "http://www.w3.org/TR/skos-reference/#narrower"
3.6 | Context | 0..n | Identifier of the context of the learning outcome. | | | O |
3.6.1 | Scheme | 1 | A reference to the definition or the schema used to describe the context values. | URI | CharString | M | "http://lre-thesaurus.eun.org/"
3.6.2 | Value | 1 | A label value/term of the context. | | Vocab. | M | "management"
3.7 | Level | 0..n | Identifier of the definition or the schema used to describe the level values. | | | O |
3.7.1 | Scheme | 1 | A reference to the schema used to describe the qualifier level values. | URI | CharString | M | "http://ec.europa.eu/education/policies/educ/eqf"
3.7.2 | Value | 1 | Represents a numeric value for the level. | | Vocab. | M | "6"
4 | Level | 0..n | A set of metadata elements that capture ranking information about the learning outcomes of learners. This includes proficiency level, interest level, weight and ageing. | | | R |
4.1 | Name | 1 | Captures the genre/name of the ranking. It can capture the proficiency level of the learning outcome, the learner's interest in obtaining the outcome or the ageing of the outcome. Some learning outcomes, like language skills, may degrade over time. | proficiency level, interest level, ageing, weight | Vocab. | M | "proficiency level"


4.2 | Value | 1 | Captures a numeric value for the level. Example: the eight EQF levels of proficiency. | | Vocab. | M | "6"
4.3 | Scheme | 1 | A reference to the definition or the schema used to describe the level values. | URI | CharString | M | "http://ec.europa.eu/education/policies/educ/eqf"
4.4 | Description | 0..1 | A textual description of the level described. This element is useful when the level value provided is not part of a common ontology or taxonomy. | | LangString | O | ("en", "advanced skill")
5 | Context | 0..n | A set of factors that are external to and give meaning to a learning outcome. For instance, subject domain and location (e.g., lab, classroom) are textual information that gives meaning to the learning outcomes. | | | R |
5.1 | Value | 1 | A label value/term of the context. | | Vocab. | M | "management"
5.2 | Scheme | 1 | A reference to the definition or the schema used to describe the context values. | URI | CharString | M | "http://lre-thesaurus.eun.org/"
5.3 | Description | 0..1 | A textual description of the context domain. This element is useful when the context value/term provided is not part of a common ontology or taxonomy. | | LangString | O | ("en", "ELR multilingual thesaurus")
6 | Assessment Record | 0..n | Captures evidence that a learner has obtained a learning outcome. This record constitutes evidence of the verification of the attainment of a certain achieved learning outcome by a certain learner. Thus, assessment records allow learners and learning outcomes to be associated in a formalised way. Apart from the learner data and learning outcome data, an assessment record provides information about the type of test performed for verifying the achieved learning outcome, as well as the responsible expert or institution who endorses it. | | | R |
6.1 | Identifier | 1 | A globally unique identifier for the assessment record. The Identifier is sufficient to reference the assessment record in any other system. | URI | | M | "www.iCoper.org/assRec/Ass003"
6.2 | Type | 1 | Captures the form of evidence accepted as a formal proof of the attainment of a learning outcome. | certificate, license, official record, self-assessment | Vocab. | M | "certificate"
6.3 | Title | 1 | Provides a readable description of the assessment record. | | LangString | M | ("en", "certificate of IT project management")


6.4 | Score | 0..1 | A numeric value that represents a result/grade of an assessment or certification. | | CharString | R | "10"
6.5 | Date | 1 | The date the evidence record is created. | ISO 8601 | DateTime | M | "2009-09-21T02:13:00"
6.6 | Assessing Body | 1 | The name of or a reference to the agent (e.g. university) that verifies the assessment record. | vCard, as defined by IMC vCard 3.0 (RFC 2425, RFC 2426) | vCard | M | "BEGIN:VCARD VERSION:3.0 ORG:WU Vienna TEL;TYPE=WORK,VOICE:(111) 111-1111 ADR;TYPE=WORK:;;Augasse 2-6;Vienna;Austria EMAIL;TYPE=PREF,INTERNET:[email protected] END:VCARD"
6.7 | Description | 0..1 | A textual description of the assessment value. This element captures the feedback type of assessment where no scores for the assessment are provided. | | LangString | O | ("en", "The student demonstrated ability in designing, implementing and managing a software development project.")
6.8 | Attached Reference | 0..n | A reference to any attachments that prove the attainment of the learning outcome and the evidence record. | URI | CharString | O | "www.univxxx.edu/diplomas/1234.pdf"
6.9 | Level of Assessment Result | 0..1 | Identifier of the definition or the schema used to describe the level values. | | | O |
6.9.1 | Scheme | 1 | A reference to the schema used to describe the level values. | URI | CharString | M | "http://ec.europa.eu/education/policies/educ/eqf"
6.9.2 | Value | 1 | Represents a numeric value for the level. | | Vocab. | M | "6"
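As an illustration of how the PALO elements fit together, the following sketch assembles a minimal profile from the example values in Table 5. The nested-dictionary shape is purely illustrative; the IRM defines elements, multiplicities and obligations, not this serialisation.

```python
# A minimal PALO instance built from the example values in Table 5.
# LangString values are modelled here as (language, text) pairs.
achievement = {
    "identifier": "http://www.wu.ac.at/students/psmith/achProf/ach1.A01",
    "title": ("en", "project management"),
    "description": ("en", "Prepared and managed a software development project"),
    "related_learning_outcome": ["www.iCoper.org/LODTax/LOD_Id-1.A.1.a.2"],
    "context": [
        {"scheme": "http://lre-thesaurus.eun.org/", "value": "management"},
    ],
    "assessment_record": ["www.iCoper.org/assRec/Ass003"],
}

profile = {
    "holder": ("en", "Peter Smith's Learning Outcomes"),
    "identifier": "http://www.wu.ac.at/students/psmith",
    "description": ("en", "Learning outcomes obtained by Peter Smith at WUW"),
    "achievements": [achievement],
}
```

Note how the achievement references its learning outcome, context and assessment record by identifier rather than embedding them, mirroring the 2.4, 2.5 and 2.6 elements of the table.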

6.3 Learning Design Data Model

The core learning design related concepts in the overall IRM concept model are Learning Design and Teaching Method, along with their related concepts. The construction and definition of the set of elements that may be used to describe a teaching method and a learning design are covered in D3.1. They are represented in the following UML conceptual model.


Figure 17. Learning Design Data Model – Diagrammatic Representation

[Figure 17 shows a UML class diagram. The class Learning Design Resource carries the attributes title, authors, licensingModel, summary, rationale, subject, learningOutcomes, groupSize, duration, learnerCharacteristics, setting (EducationalSetting), graphicalRepresentation (Picture), sequenceOfActivities, roles, resources and literatureReferences. Learning Design implements TeachingMethod (0..*); Assessment Method and Learner Assessment are connected by implements and uses associations (0..*); learning design resources may have variations of one another ("has variation"); and each resource has 0..* Comments (type: CommentType, content: string). The enumerations EducationalSetting (face-to-face, online, distant, blended) and CommentType (teacher reflection, student feedback, peer review) define the controlled value sets.]

The above data model was subsequently developed into the LOM Profile, which is used to describe learning design resources. The following transformations to the model have been applied:

• The LOM Profile primarily captures data about "Learning Design Resource" objects, i.e. learning designs, teaching methods, learner assessments, and assessment methods. Any valid LOM instance describing learning content is also a valid instance of the ICOPER LOM profile—that is, the profile can be used for other types of learning resources and objects.

• The relationships between these elements were captured in the Relation category of LOM; to support the description of the implements and uses relationships, the Relation.Kind vocabulary was extended with items referring to these relationships (in both directions), i.e. uses, is used by, implements, and is implemented by. In addition the "is variation of" vocabulary item allows capturing variations of learning design resources.

• The Comment class was captured in the LOM category 8:Annotation. To enable distinction of different types of comment, the Annotation category was extended with the 8.4:Annotation.Type element, which uses the following vocabulary: teacher reflection, student feedback, and peer review.


• LOM category 5:Educational was extended with 5.12:Educational.LearningOutcome, which is a reference to a learning outcome definition (see WP2). The learning outcome may be qualified using a Level specification (5.12.2:Level) that assigns a value defined in a particular scheme to the learning outcome (e.g. level 5 in the EQF scheme).

• Several descriptive elements of the Learning Design Resource class were mapped to the LOM base schema as follows:
  o title: mapped to 1.2:General.Title
  o authors: mapped to 2.3:LifeCycle.Contribute
  o licensingModel: mapped to elements of the category 6:Rights
  o summary: mapped to 1.4:General.Description
  o duration: mapped to 5.9:Educational.TypicalLearningTime
  o subject: mapped to 9:Classification with 9.1:Classification.Purpose = "discipline"
  o learningOutcomes: mapped to LOM extension 5.12:Educational.LearningOutcome, which references a learning outcome definition, and optionally qualifies the intended learning outcome using 5.12.2:Level
  o groupSize: mapped to 9:Classification with 9.1:Classification.Purpose = "group size" (extension of LOM base vocabulary)
  o learnerCharacteristics: this element includes a description of the "target group" of this resource, i.e. the learners' prerequisite knowledge, skills, competences, age, level within the curriculum, special attributes, and/or qualities.
     The learners' age was mapped to 5.7:Educational.TypicalAgeRange
     The prerequisites were mapped to 9:Classification with 9.1:Classification.Purpose = "prerequisite"
  o setting: mapped to 9:Classification with 9.1:Classification.Purpose = "educational setting" and vocabulary values: face-to-face, online, distant, or blended
  o sequenceOfActivities: mapped to 5.10:Educational.Description
• Those descriptive elements that are part of the content of the object were not captured in the LOM metadata. These include: rationale, graphicalRepresentation, roles, resources, and literatureReferences.
• Several additional relevant LOM elements were explicitly added to the ICOPER LOM Profile metadata:
  o 1.1:General.Identifier, which identifies an entry in a particular catalog
  o 2.1:LifeCycle.Version to enable provision of a versioning history for a resource
  o 4.1:Technical.Format to identify the file format of the object
  o 4.2:Technical.Size to define the file size
  o 4.3:Technical.Location to point to the object's download and/or viewing location(s)
  o 4.6:Technical.OtherPlatformRequirements, which can be used to provide one of the PackageFormat vocabulary values, e.g. indicating whether the object is IMS LD or IMS QTI compliant. The language identifier of the LangString should be set to "x-t-icoper-packageformat"


o 5.2:Educational.LearningResourceType; the vocabulary was extended with values referring to the particular type of learning design resource, i.e. teaching method, unit of learning, learning assessment, and assessment method.

o 5.6:Educational.Context, identifying whether the resource is suitable for higher education, training, school, or other contexts.

• All other LOM elements may still be used, but were omitted in this model since there is no specific ICOPER recommendation on their use; i.e., they are optional.
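To make the mappings above concrete, the following fragment sketches a metadata instance using the ICOPER LOM Profile. The element names and example values follow the profile description above, but the XML spelling and the omission of namespaces are simplifying assumptions, not the normative binding.

```xml
<!-- Illustrative ICOPER LOM Profile fragment (not the normative binding). -->
<lom>
  <general>
    <identifier>
      <catalog>ICOPER</catalog>
      <entry>UOL.univie.3989248</entry>
    </identifier>
    <title><string language="en">Basics of academic writing</string></title>
  </general>
  <technical>
    <location>http://test1.km.co.at/UOL/univie/3989248</location>
    <!-- Package format signalled via the reserved language identifier -->
    <otherPlatformRequirements>
      <string language="x-t-icoper-packageformat">imsld_v1p0</string>
    </otherPlatformRequirements>
  </technical>
  <educational>
    <!-- Extended learningResourceType vocabulary -->
    <learningResourceType><value>unit of learning</value></learningResourceType>
    <!-- 5.12: ICOPER extension referencing a learning outcome definition -->
    <learningOutcome>
      <reference>www.iCoper.org/LODTax/LOD_Id-1.A.1.a.1</reference>
      <level>
        <scheme>http://ec.europa.eu/education/policies/educ/eqf</scheme>
        <value>6</value>
      </level>
    </learningOutcome>
  </educational>
</lom>
```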

There are two conformance levels for the ICOPER LOM Profile:

• Basic conformance requires all mandatory elements to be present, while recommended elements may be omitted. The following table lists multiplicity constraints for this basic conformance level.

• Full conformance requires mandatory and recommended elements to be present, i.e. the lower-bound of the multiplicity of recommended elements becomes 1.
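The two conformance levels can be expressed as a simple check. This is an illustrative sketch: the function name, the representation of an instance as the set of element ids it contains, and the selection of only a few example obligations from Table 6 below are all assumptions.

```python
# Example obligations drawn from Table 6 (a subset, for illustration):
# mandatory: 1.1 Identifier, 1.2 Title, 1.3 Language, 4.3 Location
# recommended: 1.4 Description, 2.3 Contribute, 5.2 Learning Resource
# Type, 5.6 Context
MANDATORY = {"1.1", "1.2", "1.3", "4.3"}
RECOMMENDED = {"1.4", "2.3", "5.2", "5.6"}

def conformance_level(present_ids):
    """Return 'full', 'basic' or 'none' for a set of present element ids."""
    if not MANDATORY <= present_ids:
        return "none"
    # Full conformance: recommended elements become mandatory,
    # i.e. their multiplicity lower bound is raised to 1.
    if RECOMMENDED <= present_ids:
        return "full"
    return "basic"
```

An instance carrying only the mandatory elements is basically conformant; adding all recommended elements raises it to full conformance.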

Table 6. Learning Design Data Model – Table Representation

Id | Name | Description | Reference | Data type | Multiplicity | Notes and recommendations

1 | General | This category groups the general information that describes this resource as a whole. | LOM | (CATEGORY) | 1 |
1.1 | Identifier | A globally unique label that identifies this resource. | LOM | (CONTAINER) | 1..* | Mandatory element
1.1.1 | Catalog | The name or designator for the identification or cataloging scheme for this entry. A namespace scheme. | LOM | CharString | 1 | All resources stored in the OICS have an identifier with catalog set to "ICOPER"
1.1.2 | Entry | The value of the identifier within the identification or cataloging scheme that designates or identifies this resource. A namespace specific string. | LOM | CharString | 1 | e.g. "UOL.univie.3989248"
1.2 | Title | Title of this resource. | LOM | LangString | 1 | Mandatory element. Example: ("en", "Basics of academic writing")
1.3 | Language | The primary human language or languages used within this resource to communicate to the intended user. | LOM | CharString | 1..* | Mandatory element. ISO 639:1988 (langcode) and ISO 3166-1:1997 (optional subcode). Example: "en-US"
1.4 | Description | A textual description of the content of this resource. | LOM | LangString | 0..* | It is recommended to give sufficient level of detail in the description of a learning resource in order to improve its discoverability and presentation to users. Example: ("en", "Learners learn in three tasks how to write magazine stories for a target audience, in this case, for a teenage magazine.")


2 | Life Cycle | This category describes the history and current state of this resource and those entities that have affected this resource during its evolution. | LOM | (CATEGORY) | 0..* |
2.1 | Version | The version of this resource. | LOM | LangString | 0..1 | Example: ("x-none", "1.0")
2.3 | Contribute | Those entities that have contributed to the state of this resource during its life cycle. | LOM | (CONTAINER) | 0..* | It is recommended to provide at least one instance of this element, with 2.3.1 Role set to one of { author, publisher, content provider }, in order to give the user a chance to retrieve the person or institution holding IPR on the resource and to provide input to recommendation services.
2.3.1 | Role | Role of contributing entity. | LOM | { author, publisher, unknown, initiator, terminator, validator, editor, graphical designer, technical implementer, content provider, technical validator, educational validator, script writer, instructional designer, subject matter expert } | 1 |
2.3.2 | Entity | The identification of and information about entities (i.e., people, organizations) contributing to this resource. The entities shall be ordered as most relevant first. | LOM | CharString (vCard) | 1..* | Example: "begin:vcard\nversion:3.0\nfn:Michael Derntl\nn:Derntl;Michael\nemail;type=intern:[email protected]\nend:vcard"
4 | Technical | Technical requirements and characteristics of this resource. | LOM | (CATEGORY) | 1 |
4.1 | Format | Technical data type(s) of all components of this resource. | LOM | CharString (MIME type) | 0..* | Example: "application/zip", "application/msword"
4.3 | Location | Location of the resource (for download or viewing). | LOM | CharString (URL) | 1..* | Mandatory element. Example: "http://test1.km.co.at/UOL/univie/3989248"


4.6 Other Platform Requirements

Information about other software and hardware requirements.

LOM LangString 0..* It is recommended to use this element in order to indicate the package format of the resource, if any. The package format is expressed using the LRE ILOX Manifestation Package Names defined in the Learning Resource Exchange Metadata Application Profile Version 4.5, p. 85. The language identifier to be used is "x-t-icoper-packageformat". Examples: ("x-t-icoper-packageformat", "imsqti_v1p2") ("x-t-icoper-packageformat", "imsld_v1p0")

5 Educational Key educational or pedagogic characteristics of this resource.

LOM (CATEGORY) 0..1

5.2 Learning Resource Type

Specific type of resource. The most dominant kind shall be first.

LOM extension

{ unit of learning, teaching method, assessment method, learning assessment }

0..* It is recommended to use this element, since it allows understanding how the resource can be used without inspecting its content or structure.

5.6 Context Principal environment in which this resource is intended to be used.

LOM { school, higher education, training, other }

0..* It is recommended to provide one instance of this element.

5.7 Typical Age Range

Typical age range of end users of this resource.

LOM LangString 0..* Part of the description of the learner characteristics for a resource, e.g. the typical age of learners addressed by a teaching method. Example: “18-”

5.9 Typical Learning Time

The estimated amount of time it takes to work with or through this resource when it is being used / implemented.

LOM Duration 0..1 For a learning design resource (i.e., a teaching method, assessment method, learning design or learning assessment), this element refers to the typical duration of an implementation of the resource in a concrete setting. For a content resource (e.g., a learning object) this would refer to the typical duration it will take the target audience to work with or complete the resource. Example: “PT1H30M”
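Duration values such as "PT1H30M" follow the ISO 8601 time-duration syntax. The sketch below converts such a value to minutes with hand-rolled parsing; it covers only the time components used in the examples here, and a production system might prefer a dedicated library.

```python
import re

# Time-only ISO 8601 durations as used by element 5.9, e.g. "PT1H30M".
_DURATION = re.compile(r"^PT(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)S)?$")

def duration_to_minutes(value: str) -> float:
    """Convert a time-only ISO 8601 duration string to minutes."""
    match = _DURATION.match(value)
    if not match or value == "PT":
        raise ValueError(f"unsupported duration: {value!r}")
    hours, minutes, seconds = (int(g) if g else 0 for g in match.groups())
    return hours * 60 + minutes + seconds / 60

assert duration_to_minutes("PT1H30M") == 90
```

The same pattern rejects strings that are not durations at all, which is useful when validating harvested metadata.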

5.10 Description Detailed description of the activities performed by the participants as part of working with or through the resource.

LOM LangString 0..1


5.12 Learning Outcome

References and qualifies the definition of an intended learning outcome aimed at by this resource.

(CONTAINER) 0..* It is recommended to provide a complete set of learning outcomes that are intended to be achieved through this resource.

5.12.1 Identifier A globally unique identifier that references a learning outcome definition.

(CONTAINER) 1

5.12.1.1 Catalog The name or designator for the identification or cataloging scheme for this entry. A namespace scheme.

CharString 1 Example: “ICOPER”

5.12.1.2 Entry The value of the identifier within the identification or cataloging scheme that designates or identifies this learning outcome definition. A namespace specific string.

CharString 1 Example: “OpenLearn:0059F4D4BC072E04DF5001D84BA9C”

5.12.2 Level A set of metadata elements that capture ranking information about the learning outcomes of learners. This includes proficiency level, interest level, weight, ageing.

PALO (CONTAINER) 0..*

5.12.2.1 Name Captures the genre/name of the ranking. It can capture the proficiency level of the learning outcome, the learner’s interest in obtaining the outcome, or the ageing of the outcome. Some learning outcomes, such as language skills, may degrade over time.

PALO { proficiency level, interest level, ageing, weight }

1 Example: “proficiency level”

5.12.2.2 Value Captures a numeric value for the level. Example: the eight EQF proficiency levels.

PALO Vocab. 1 Example: “2”

5.12.2.3 Scheme A reference to the definition or the schema used to describe the qualifier level values.

PALO CharString 0..1 Example referencing the EQF levels: “http://ec.europa.eu/education/policies/educ/eqf”

5.12.2.4 Description A textual description of the level. This element should be provided when the Scheme element is not provided, i.e. when the level value is not part of a common ontology or taxonomy.

PALO LangString

0..1 Example: (“en”, “basic factual knowledge”)
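The 5.12 Learning Outcome structure (an identifier plus optional PALO level entries) can be mirrored in a small data model. The sketch below is non-normative; all class and field names are illustrative, and only the restricted Level vocabulary is taken from the table above.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Restricted vocabulary of 5.12.2.1 Name.
LEVEL_NAMES = {"proficiency level", "interest level", "ageing", "weight"}

@dataclass
class OutcomeLevel:
    """One 5.12.2 Level entry (PALO)."""
    name: str                                  # 5.12.2.1, restricted vocabulary
    value: str                                 # 5.12.2.2, e.g. EQF level "2"
    scheme: Optional[str] = None               # 5.12.2.3, e.g. the EQF URL
    description: Optional[Tuple[str, str]] = None  # 5.12.2.4, (lang, text)

    def __post_init__(self):
        if self.name not in LEVEL_NAMES:
            raise ValueError(f"invalid level name: {self.name!r}")

@dataclass
class LearningOutcome:
    """One 5.12 Learning Outcome reference."""
    catalog: str                               # 5.12.1.1
    entry: str                                 # 5.12.1.2
    levels: List[OutcomeLevel] = field(default_factory=list)

lo = LearningOutcome(
    catalog="ICOPER",
    entry="OpenLearn:0059F4D4BC072E04DF5001D84BA9C",
    levels=[OutcomeLevel(
        "proficiency level", "2",
        scheme="http://ec.europa.eu/education/policies/educ/eqf")],
)
```

The `__post_init__` check enforces the restricted Name vocabulary at construction time, which mirrors the multiplicity-1 constraint on 5.12.2.1.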

6 Rights This category describes the intellectual property rights and conditions of use for this resource.

LOM (CATEGORY) 0..1


6.3 Description Textual description of IPR, licensing model, copyright information, etc. of this resource.

LOM LangString 0..1 For resources published under a Creative Commons License, it is recommended to make use of the “x-t-cc-url” token language identifier, as defined in the LRE 4.0 Application Profile. Example: (“en”, “Licensed under a Creative Commons BY-NC-SA 3.0 License.”), (“x-t-cc-url”, “http://creativecommons.org/licenses/by-nc-sa/3.0”)

7 Relation This category defines the relationship between this resource and other resources, if any. To define multiple relationships, there may be multiple instances of this category. If there is more than one target resource, then each target shall have a new relationship instance.

LOM (CATEGORY) 0..* Use this category to describe the following relationships: resource is variation of resource; learning design implements teaching or assessment method; learning design uses learning assessment; learning assessment implements assessment method; plus any relationships based on LOM base vocabulary

7.1 Kind Nature of the relationship between this resource and the target resource.

LOM extension

{ uses, is used by, implements, is implemented by, is variation of } or any of the LOM base values { is part of, has part, is version of, has version, is format of, has format, references, is referenced by, is based on, is basis for, requires, is required by }

1

7.2 Resource Target resource of this relationship. LOM (CONTAINER) 1

7.2.1 Identifier A globally unique label that identifies the target resource.

LOM (CONTAINER) 1..*

7.2.1.1 Catalog Name or designator of the identification or cataloging scheme for this entry. A namespace scheme.

LOM CharString 1 Example: “ICOPER”

7.2.1.2 Entry Value of the identifier within the catalog. A namespace-specific string.

LOM CharString 1 Example: “TM.univie.3989445”


7.2.2 Description Description of the target resource. LOM LangString 0..* Example: (“en”, “The case-study method is used to train and improve decision-making skills. Case studies are well-prepared, structured, and reduced in their content, so that the presented situation is not too overwhelming. The method was developed in business education.”)

8 Annotation This category provides comments on the educational use of this learning object, and information on when and by whom the comments were created. This category enables educators to share their assessments of resources, peer review comments, learners’ feedback, teacher reflections, suggestions for use, etc.

LOM (CATEGORY) 0..*

8.3 Description The content of this annotation. LOM LangString 1 Example: (“en”, “When using this teaching method make sure to allocate at least one hour for the final reflection round”)

8.4 Type The type of annotation or comment. LOM extension

{ teacher reflection, peer review, student feedback, other }

0..1 Example: “teacher reflection”

9 Classification This category describes where this resource falls within a particular classification system. To define multiple classifications, there may be multiple instances of this category.

LOM (CATEGORY) 0..* In addition to classification purposes identified in the base LOM schema, use this category to identify the subject area, educational setting, and appropriate group size for this resource. It is recommended to have one or more classification elements with Purpose = “discipline”. It is recommended to make use of an established domain specific thesaurus.

9.1 Purpose The purpose of classifying this resource.

LOM extension

{ group size, educational setting } plus LOM base values { discipline, idea, prerequisite, educational objective, accessibility, restrictions, educational level, skill level, security level, competency }

1 It is not recommended to use the educational objective, skill level, educational level, and competency classification purposes, since these should be expressed using Learning Outcome elements (5.12).

9.2 Taxon Path (The “Taxon Path” structure is adopted from LOM.)

LOM (CONTAINER) 0..*


9.3 Description Description of the resource relative to the stated classification purpose.

LOM LangString 0..1 If Purpose = “educational setting”, the description must be one of { online, distant, face-to-face, blended }. Example (for Purpose = “discipline”): (“en”, “Computer Science”)
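The constraint on 9.3 (a fixed vocabulary whenever the classification purpose is "educational setting") can be enforced by a trivial check. A non-normative sketch, with illustrative function and constant names:

```python
# Vocabulary check for the 9.3 Description constraint.
EDUCATIONAL_SETTINGS = {"online", "distant", "face-to-face", "blended"}

def valid_classification_description(purpose: str, description: str) -> bool:
    """For Purpose 'educational setting' the description is restricted to a
    fixed vocabulary; for all other purposes it is free text."""
    if purpose == "educational setting":
        return description in EDUCATIONAL_SETTINGS
    return True

assert valid_classification_description("educational setting", "blended")
```

A metadata validator would run such a check per Classification instance before accepting a record.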


7 Conclusion

This document presented the outcome of the inquiry into the IRM, a reference model that provides a common frame of reference for the work in and around the ICOPER project, and for the broader community engaged in the research, development and use of TEL systems and applications and e-learning in general. The main achievement is a meta-model: a structure for the stakeholder activities, the services provisioned for those activities, and the data models enabling the interoperability of content, systems and applications for the provisioning of those services. The IRM structure shows how, for a need arising from an area of user activity, a solution can be provided using open educational resources, stored using content and metadata standards, with services rendered by systems and applications designed following the IRM outline.

The process models in the current IRM version define relevant areas of user activity in order to understand those activities, to develop TEL applications that support them, and to map the OICS services and data to the stakeholder activity. This provides a means to communicate the benefit of standardized OER and the practical implementation of the outcome-based approach to higher education activities. The processes are connected to the stakeholder world through high-level process areas, which are broken down into ICOPER-specific processes, sub-processes and activities, to which the services provisioned by prototype application functionality can be mapped.

The IRM itself has emerged as a solution to the design problem of a common construct supporting a shared understanding of the logical designs to follow in the TEL domain. At this point, the meta-structure has been achieved. In the remaining project time, this meta-structure will be populated with more details and further specifications. The domain standards, which were presented in the earlier work on the IRM, D7.1, will be revised and incorporated into the final IRM deliverable.


References

Azouaou, F., & Desmoulins, C. (2006). Using and modeling context with ontology in e-learning: the case of teacher’s personal annotation. International Workshop on Applications of Semantic Web Technologies for E-Learning (SW-EL), Dublin, Ireland.

BPMN Specification Version 2.0 beta, OMG Document Number: dtc/2009-08-14.

Dochy, F. J. R. C., & McDowell, L. (1997). Assessment as a tool for learning. Studies in Educational Evaluation, vol. 23, pp. 279-298.

European Commission (2008). The European Qualifications Framework for Lifelong Learning. Retrieved August 2009, from http://ec.europa.eu/education/policies/educ/eqf/eqf08_en.pdf

Fettke, P., & Loos, P. (2003). Multiperspective Evaluation of Reference Models – Towards a Framework. Lecture Notes in Computer Science: Conceptual Modeling for Novel Application Domains, pp. 80-91. Springer.

Frank, U. (2007). Evaluation of Reference Models. In: Fettke, P., & Loos, P. (Eds.), Reference Modeling for Business Systems Analysis, pp. 118-140. Idea Group Publishing.

Institute of Electrical and Electronics Engineers (IEEE) (1990). IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer Glossaries. New York, NY.

ISO 9001:2000. Quality management systems – Requirements.

ISO (2005). ISO/IEC 19796-1:2005 – Information technology – Learning, education and training – Quality management, assurance and metrics – Part 1: General approach. ISO.

JISC (2005). JISC FREMA. Available at http://www.frema.ecs.soton.ac.uk

Müller, D. (Ed.), Zimmermann, V., & Peters, J. (2009). Report on technical basis of IMS Learning Design to combine instructional models with collaboration services and rich media (open) content. ICOPER project report, 30 September 2009.

Olivier, B., & Tattersall, C. (2005). The Learning Design Specification. In: Koper, R., & Tattersall, C. (Eds.), Learning Design: A Handbook on Modelling and Delivering Networked Education and Training (pp. 21-40). Berlin-Heidelberg: Springer.

Peffers, K., et al. (2007). A Design Science Research Methodology for IS Research. Journal of Management Information Systems, 24(3), pp. 45-77.

Quality Assurance Agency for Higher Education (QAA) (2000). Code of practice for the assurance of academic quality and standards in higher education, Section 6: Assessment of students, May.

SOA Systems Inc. (2009a). What is SOA? An introduction to Service-Oriented Computing. http://www.whatissoa.com/

SOA Systems Inc. (2009b). SOA Principles. An introduction to the Service-Orientation Paradigm. http://www.soaprinciples.com/

The Open Group (2009). TOGAF Version 9. The Open Group Architecture Framework (TOGAF), Document Number: G091.

Thomas, O. (2006). Understanding the term reference model in information systems research: History, Literature Analysis and Explanation. In: Bussler, C., & Haller, A. (Eds.), Business Process Management Workshops: BPM 2005 International Workshops (pp. 484-496). Berlin: Springer.


Welsch, E. (2002). SCORM: Clarity or Calamity? Online Learning Magazine.

Wilson, S., Blinco, K., & Rehak, D. (2004). An E-Learning Framework: A Summary. Available at http://www.jisc.ac.uk/uploaded_documents/Altilab04-ELF.pdf

Wisher, R. (2009). SCORM 2004, 4th Edition. Advanced Distributed Learning. Available at http://www.adlnet.gov/Technologies/scorm/SCORMSDocuments/2004%204th%20Edition/Documentation.aspx

Yorke, M. (2003). Formative assessment in higher education: Moves towards theory and the enhancement of pedagogic practice. Higher Education, vol. 45, pp. 477-501.


Appendix A: Engineering Evaluation

ICOPER Prototype Development - Engineering Evaluation

Dear prototype developer!

You and your organisation have invested a significant amount of resources implementing a prototype based on ICOPER specifications. The ICOPER specifications were designed to solve a particular interoperability problem in the context of your organisation. We are now interested in your feedback on these specifications and the development effort.

Not all questions presented below are relevant for all target groups. It depends very much on your personal profile whether you will feel comfortable with some of the questions. Please do not hesitate to skip some of them if you do not feel in a position to answer a question or two. On the other hand, we are of course interested in the maximum feedback you are able to provide. We ask every prototype developer to provide at least one set of answers to these questions from a developer, but please consider getting answers also from other stakeholders in your organisation, such as management. Please add it to your prototype description for D1.2.

1. Contact Information

1.1 Your roles (Note: if appropriate, select more than one):
( ) Developer (coding the prototype)
( ) Designer (designing the interface)
( ) Researcher (working on requirement specifications)
( ) Decision-maker on the development of technology in an organisation

1.2 Contact Details:
Name:
Tel:
Mail:
Web:

2. Prototype Summary

In your own words, which interoperability problem have you been working on? Which are the technical systems that you were aiming to connect?


Does your prototype make full use of the ICOPER Middle Layer Specifications? If not, at which level does it support them? Which functions have you been focusing on? Is your prototype easy to adapt or extend for new versions of the specification in the future?

I worked with the ...
( ) WU Middle Layer Implementation
( ) KU Leuven Middle Layer Implementation

Your prototype is addressing the following end users:

( ) Learners
( ) Teachers
( ) Curriculum Developers
( ) Higher Education Management

3. Your Open Feedback (Qualitative Evaluation)

3.1 ICOPER Middle Layer Specifications

3.1.1 What do you think about the ICOPER Middle Layer Specifications? Are they helpful to achieve interoperability? Are they useful? How would you assess their relevancy?

3.1.2 Are the specifications sufficiently adaptable / extensible? Are data models sufficiently expressive / complete? Are service specifications sufficiently feature-rich / complete?

3.1.3 Were the key concepts introduced clear to you? Transferable also to your (sub-)domain?

3.1.4 Are the ICOPER Middle Layer Specifications programmer-friendly (easy to read, understandable, easy to implement)? How would you assess the skill level required on the programmer side for implementing these specifications?

3.1.5 How would you summarize the highlights of the ICOPER Middle Layer Specifications? How do they differentiate themselves from other specifications you are aware of? Would you consider them innovative / highly original work?

3.1.6 How would you summarize the lowlights and major weaknesses of the ICOPER Middle Layer Specifications? Have you been able to identify major weaknesses compared to other specifications? Have you been able to identify particular things that really need to be changed? Please also assess the priority of your proposal for change.

3.1.7 Which opportunities – and potentially threats – do the ICOPER Middle Layer Specifications create for the ICOPER consortium? Do you think this work will contribute to the superiority of our products and services / our reputation in the field? Is the work inspiring to others? Will it lead to surprising results? How would you assess the opportunities and threats quantitatively (high – low)? What is required to seize the opportunities / overcome threats?

3.2 Implementation Support

3.2.1 Have you felt sufficiently supported by experts and tools?

3.2.2 How would you summarize the highlights of the ICOPER Middle Layer Implementation? How does it differentiate itself from other implementations you are aware of? Would you consider it innovative / highly original work?

3.2.3 How would you summarize the lowlights and major weaknesses of the ICOPER Middle Layer Implementation? Have you been able to identify major weaknesses compared to other implementations? Have you been able to identify particular things that really need to be changed? Please also assess the priority of your proposal for change.

3.2.4 Which opportunities – and potentially threats – does the ICOPER Middle Layer Implementation create for your organisation? Do you think this work will contribute to the superiority of your products and services? Is the work inspiring to you? Will it lead to surprising results? How would you assess the opportunities and threats quantitatively (high – low)? What is required to seize the opportunities / overcome threats?

4. Your Quantitative Evaluation

4.1 ICOPER Middle Layer Specifications

What do you think about the ICOPER Middle Layer Specifications in general?
1: very bad 2 3 4 5 6 7: very good
Are they helpful to achieve interoperability?
1: not at all 2 3 4 5 6 7: very helpful
The problems the ICOPER Middle Layer Specifications tackle can be considered ...
1: not at all relevant 2 3 4 5 6 7: very relevant
I consider the ICOPER Middle Layer Specifications ...
1: not adaptable at all 2 3 4 5 6 7: highly adaptable
I consider the ICOPER Middle Layer Specifications ...
1: very difficult to understand 2 3 4 5 6 7: very easy to understand
I consider the ICOPER Middle Layer Specifications ...
1: not programmer-friendly at all 2 3 4 5 6 7: highly programmer-friendly

To what degree are the following attributes achieved by the ICOPER Middle Layer Specifications?


Conceptual integrity: 1: realized to a high degree 2 3 4 5 6 7: not at all realized
Correctness: 1: realized to a high degree 2 3 4 5 6 7: not at all realized
Completeness: 1: realized to a high degree 2 3 4 5 6 7: not at all realized
Balanced specificity: 1: realized to a high degree 2 3 4 5 6 7: not at all realized
Implementation transparency: 1: realized to a high degree 2 3 4 5 6 7: not at all realized
Evolvability: 1: realized to a high degree 2 3 4 5 6 7: not at all realized
Extensibility: 1: realized to a high degree 2 3 4 5 6 7: not at all realized

4.2 ICOPER Middle Layer Implementation

What do you think about the ICOPER Middle Layer Implementation in general?
1: very bad 2 3 4 5 6 7: very good
Is it helpful to achieve interoperability?
1: not at all 2 3 4 5 6 7: very helpful
The problems the ICOPER Middle Layer Implementation tackles can be considered ...
1: not at all relevant 2 3 4 5 6 7: very relevant
I consider the ICOPER Middle Layer Implementation ...
1: not adaptable at all 2 3 4 5 6 7: highly adaptable
I consider the ICOPER Middle Layer Implementation ...
1: very difficult to work with 2 3 4 5 6 7: very easy to work with
I consider the ICOPER Middle Layer Implementation ...
1: not programmer-friendly at all 2 3 4 5 6 7: highly programmer-friendly

For each of the following quality attributes, please state how important they are for your development and to what degree the implementation currently achieves them.

Performance:
1: very important 2 3 4 5 6 7: not at all important
1: realized to a high degree 2 3 4 5 6 7: not at all realized
Security:
1: very important 2 3 4 5 6 7: not at all important
1: realized to a high degree 2 3 4 5 6 7: not at all realized
Availability:
1: very important 2 3 4 5 6 7: not at all important
1: realized to a high degree 2 3 4 5 6 7: not at all realized
Portability:
1: very important 2 3 4 5 6 7: not at all important
1: realized to a high degree 2 3 4 5 6 7: not at all realized


Reusability:
1: very important 2 3 4 5 6 7: not at all important
1: realized to a high degree 2 3 4 5 6 7: not at all realized
Testability:
1: very important 2 3 4 5 6 7: not at all important
1: realized to a high degree 2 3 4 5 6 7: not at all realized


Appendix B: End User Evaluation Plan of Prototypes

1. Scoping

Several issues need to be addressed for evaluating the prototypes:

(i) Why: Does the evaluation primarily aim to collect benchmark measures for future references or diagnostic data (i.e. usability problems and improvement suggestions)?
(ii) Who: With whom will the prototypes be tested – target groups? How many of them? Who can and will conduct the test?
(iii) Which: Are there any specific success criteria that individual prototypes aim to meet?
(iv) Where: Will the testing involve a single site (local) or multiple sites?
(v) When: Is there any specific time constraint (target group availability) to conduct the evaluation?
(vi) Whether: Will initial training (or demonstration) be provided to test participants? If yes, how will the training be presented?

The development team of the respective prototype should consider these scoping questions before selecting the options to be described subsequently. The team should note their responses to those questions.

2. Evaluation Testing Protocol

2.1 Preparation of the Infrastructure

o Install the executable prototype
o Check the availability of the data in the repository, if applicable
o Install screen-capture software – Camtasia Studio (free trial version: http://www.techsmith.com/download/camtasiatrial.asp). Other similar software can also be used
o Get audio-recording equipment ready (e.g., dictaphone, MP3 player). Note: Camtasia Studio has an audio-recording function

2.2 Recruitment of End-users

o Representative end-users of the prototypes as test participants (cf. corresponding use case description)
o Recommended number of test participants: minimum 5

2.3 Data Collection Procedure

(i) Introduction & Pre-test Interview: Explain the aim and procedure of the evaluation to the test participant. Specifically, it is important to mention explicitly the standard to which the prototype is related. Then conduct the semi-structured interview with the given questions. The interview should be audio-taped.


Note: We should allow the test participants to respond in their mother tongue, if they prefer. In this case, verbal protocols need to be translated.

(ii) Background questionnaire

(iii) Task scenario: The test participant is asked to carry out the step-by-step actions defined in the use cases that the prototypes are supposed to support. The prototypes and the corresponding use cases are accessible at:
https://docs.google.com/Doc?docid=0ASHu6pCvngDDYWg4ZzJyZGM2bXRjXzQ2Y3E4dmM3Y3I&hl=en_GB&pli=1
For example, Clix testing users should carry out the actions to update a Learning Outcome Profile (LOP), to send the LOP to the OICS, etc.; AGH Facebook application users should be able to import and view the LOP; the UMU Moodle module should be able to retrieve a UoL. (See Sample Task Scenario.)
Many of the use case descriptions can be found at:
http://www.educanext.org/dotlrn/clubs/icoper/wp1/new-lors/scenarios
Please note that some of the use cases found on EducaNext may be outdated as the project progresses. The correct, updated use case step-by-step actions may need to be discussed between the development/testing site and the Task Force leaders.

(iv) Think aloud:
a. Concurrent – ask the test participant to verbalise what is on his mind when carrying out the task. The rationale is to understand how the test participant thinks and feels about the interaction with the prototype.
b. Retrospective – if the test participant is not comfortable with concurrent think aloud, the alternative is to replay the screen recording (provided it is available) and ask the test participant to comment on the actions of interest. Audio-recording is advisable.

(v) Observation: In addition to think aloud, the experimenter can observe the test participant’s verbal and non-verbal behaviour and task performance, enabling different measures to be taken. Specifically, efficiency – the time required to complete the task (which can automatically be recorded with the screen-capture software) – and effectiveness – the number of different types of errors committed – can be registered. More importantly, usability problems that the user experiences can be noted.


(vi) Post-test Questionnaire: After completing the task scenario, the test participant is asked to complete the questionnaire entitled “Integrated User Satisfaction and Acceptance of Information Technologies” (IUSAIT). Specifically, IUSAIT is built upon an integrated evaluation model drawing on DeLone and McLean’s Model of Information System Success (D&M IS model) (DeLone & McLean, 2003), Technology Acceptance Model 3 (TAM3) (Venkatesh & Bala, 2008; Venkatesh, Morris, Davis, & Davis, 2003) and the IBM Computer Usability Satisfaction Questionnaires (Lewis, 1993, 1995).

(vii) Post-test Interview: The test participant is interviewed to elaborate his/her opinions on various issues (Appendix F). The interview needs to be audio-taped.
Note: We should allow the test participants to respond in their mother tongue, if they prefer. In this case, verbal protocols need to be translated.

3. Data Analysis

3.1 Qualitative Data

o Think aloud verbal protocols
o Observational notes
o Video recording
o Audio recording

Content analysis will be applied to these data to extract usability problems and identify improvement suggestions.

3.2 Quantitative Data

o Task completion rate
o Task completion time
o Number of errors
o Results of the IUSAIT

Statistical analysis will be applied to validate the underlying theoretical model and derive patterns of user satisfaction and acceptance towards the prototype.

Note: Details to be given later. Evaluation data can be sent to ULE for data analysis.
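The quantitative measures above (task completion rate, task completion time, number of errors) can be aggregated straightforwardly. A minimal sketch follows; the session records are invented sample data, not project results.

```python
from statistics import mean

# One tuple per test session: (task_completed, seconds_on_task, error_count).
# The numbers are invented sample data for demonstration.
sessions = [
    (True, 310, 1),
    (True, 275, 0),
    (False, 600, 4),
    (True, 342, 2),
    (True, 290, 0),
]

# Effectiveness: share of sessions in which the task was completed.
completion_rate = sum(1 for done, _, _ in sessions if done) / len(sessions)
# Efficiency: mean time on task, counting completed sessions only.
mean_time_completed = mean(t for done, t, _ in sessions if done)
# Mean number of errors across all sessions.
mean_errors = mean(e for _, _, e in sessions)
```

Restricting the time measure to completed sessions avoids mixing abandonment time into the efficiency figure; abandoned sessions still count against the completion rate.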


Introduction & Pre-test Interview

Instruction: You can read the introductory text below aloud or print it out for the test participants to read by themselves. When they finish reading it, proceed with the interview.

INTRODUCTION

The European eContent+ Network of Excellence, ICOPER, is concerned with the technology-enhanced, standards-based implementation of an outcome-oriented higher education. The project analyzed and discussed state-of-the-art implementations of the European Qualification Framework, Learning Design, SCORM, and QTI. Based on this analysis work, ICOPER aims to contribute to the next generation of learning tools through the development and evaluation of a comprehensive set of prototypes. As we much value user involvement and openness, we would like to invite you to share your feedback on these prototypes. Please let us ask and discuss the following questions.

Instruction: The interview should cover two major areas. The questions in the respective section are some suggestions. Please modify them and add new ones whenever appropriate and necessary. Please write down your modified and additional questions.

INTERVIEW QUESTIONS 

1. Outcome orientation is currently a hot topic in higher education.

a. What do you think about this trend?

b. What would be your institution's stance on an outcome-oriented learning design and delivery?

c. What strengths and weaknesses would you associate with your institution's current stance on this issue?

2. The prototype we are going to present is implementing Use Case / Process [XXX].

a. How is this process currently implemented in your institution?

b. Which technologies are you using to support this?

c. Which kind of support is provided on an organisational level?

d. How would you assess the status quo with respect to quantity and quality of the content/resources provided?

e. What strengths and weaknesses would you associate with your institution's current implementation?

f. Without having seen our proposal, can you please mention a few functional (i.e. features) and non‐functional (i.e. other aspects of the technology such as user‐friendliness) requirements of the technology support? 


Pre-Test Questionnaire

Participant Code: ________________ (to be filled in by the Experimenter)   Date: _____________

1. Gender:   male   female

2. Age:   20 – 24   25 – 29   30 – 39   40 – 49   >= 50

3. Your job title:

   University faculty member
   Researcher
   Software developer/engineer
   Project manager
   University student (specify academic level: ___________________________)
   Other (specify: ____________________________________________________)

4. Your area(s) of expertise:

5. Please rate your competence in relation to Information and Communication Technologies (ICT):

Very poor   Poor   Fairly poor   Medium   Fairly rich   Rich   Very rich

6. Please rate your experiences in Technology-enhanced Learning (or e-learning):

Very poor   Poor   Fairly poor   Medium   Fairly rich   Rich   Very rich

7. Please rate your familiarity with the Standard/Specification: [To be replaced by the exact name of the Standard/Spec on which the prototype is built]

Very poor   Poor   Fairly poor   Medium   Fairly rich   Rich   Very rich

8. For each of the following statements, please rate your level of agreement on a 7-point scale:

Strongly Disagree                                                      Strongly Agree

I could complete the job using a web-based application …
… if there was no one around to tell me what to do as I go   1   2   3   4   5   6   7
… if I had just the built-in help facility for assistance   1   2   3   4   5   6   7


… if someone showed me how to do it first   1   2   3   4   5   6   7
… if I had used similar packages before this one to do the same job   1   2   3   4   5   6   7

Thank you for your kind cooperation! 


Sample Task Scenario

Task scenario: Retrieve units of learning

Instruction: Please read the following task scenario carefully and implement it step by step with the prototype given. You are required to think aloud when carrying out the task. An option is to be adapted, depending on the selected workflow.

Use case: <Retrieve units of learning> [This use case is extracted from D2.1]

• Goal: The actor retrieves one or more units of learning
• Actors: Learner/Teacher
• Conditions: The actor is authorised to retrieve the units of learning
• Description: The actor provides the system with the desired learning outcomes and the corresponding context and qualifiers. From such input, the system returns one or more units of learning
• Steps (or course of events):

Actor action | System response
1. This use case begins when the learner/teacher accesses the system with the intention to retrieve one or more units of learning |
2. The learner/teacher provides the system with the learning outcome |
3. The learner can provide more context requirements |
 | 4. The system checks the input and searches its repository/repositories for the appropriate unit(s) of learning
5. The learner gets the results from the system |

• Variations <Step 3>:
- The learner requires one or more units of learning. In this case, the system identifies the requested unit(s) of learning in <Step 4>
- The learner requires one or more teaching methods associated with a unit of learning. In this case, the system identifies the requested teaching method(s) in <Step 4>
- The learner requires one or more assessments. In this case, the system identifies the assessment(s) in <Step 4>
• Issues:
o <Units of learning>: More details about units of learning will be available in D3.1
o <Teaching methods>: More details about teaching methods will be available in D3.1
o <Assessment>: More details about assessments will be available in D6.1
• Result:
o Success: The learner/teacher retrieves one or more units of learning
o Failure: The learner/teacher does not retrieve any unit of learning or retrieves only some of the requested units of learning
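The course of events above can be sketched in code. This is only an illustrative sketch: `Repository`, `UnitOfLearning`, and the matching rule are hypothetical names and logic introduced here, not part of any ICOPER component or specification.

```python
from dataclasses import dataclass, field

@dataclass
class UnitOfLearning:
    title: str
    outcomes: set                                   # learning outcomes the unit addresses
    context: dict = field(default_factory=dict)     # context qualifiers, e.g. {"level": "BA"}

class Repository:
    """Hypothetical repository of units of learning (the system of Step 4)."""

    def __init__(self, units):
        self.units = list(units)

    def retrieve(self, outcomes, context=None):
        # Steps 2-3: the actor supplies desired outcomes and optional context qualifiers.
        # Step 4: match units whose outcomes cover the request and whose context
        # agrees with every supplied qualifier.
        context = context or {}
        return [u for u in self.units
                if outcomes <= u.outcomes
                and all(u.context.get(k) == v for k, v in context.items())]

repo = Repository([
    UnitOfLearning("Intro to IMS LD", {"design-learning"}, {"level": "BA"}),
    UnitOfLearning("Advanced QTI", {"author-assessments"}, {"level": "MA"}),
])
# Step 5: the learner gets the results; an empty list corresponds to the Failure result
results = repo.retrieve({"design-learning"}, {"level": "BA"})
```

The variations (teaching methods, assessments) would follow the same pattern with different resource types.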


Data-Logging Form

Note: A new Data-Logging Form is used for each test participant (copy the following table as many times as required).

Usability Test of the Prototype XXXX

Participant ID:          M = menu choice error         O = Online help
Date:                    S = select from list error    H = Help desk
Tester/Institute:        E = other error               F = Frustration/confusion

Task | Completion Time 12 | M | S | E | O | H | F | Participant's Thinking-Aloud Protocol | Local Tester's Observations and Comments 13

12 Start-time: when the user starts to work on the computer after reading the task description. End-time: when the user finishes the task or gives it up. Completed: indicate whether the user is able to complete the task or not.

13 The Local Tester should describe in detail what she observes when the test participant is experiencing some problem with the use of the system. In other words, the context of a problem needs to be described very clearly. Specifically, the Local Tester should explicitly state the criteria based on which she thinks that the participant has encountered usability problems, and also rate the severity of each problem identified according to the following scheme:

• Severe usability problems are those that prevent the user from completing a task or result in catastrophic loss of data or time.
• Moderate usability problems are those that significantly hinder task completion but for which the user can find a work-around.
• Minor usability problems are those that are irritating to the user but do not significantly hinder task completion.
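The three-level severity scheme above can be expressed as a small classification helper. The function name and its boolean inputs are illustrative assumptions made here, not part of the evaluation toolkit:

```python
def rate_severity(task_completed: bool, workaround_found: bool,
                  hindered_significantly: bool) -> str:
    """Map the Local Tester's observations onto the three-level severity scheme."""
    if not task_completed and not workaround_found:
        # Prevents the user from completing the task
        # (or causes catastrophic loss of data or time)
        return "severe"
    if hindered_significantly:
        # Significantly hinders completion, but a work-around exists
        return "moderate"
    # Irritating, but does not significantly hinder task completion
    return "minor"
```

Catastrophic data or time loss would also warrant "severe" even for a completed task; the sketch covers only the completion-based criteria.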


Task 1  Apply for a user account      

Start‐time:   End‐time:   Completed: Yes / No 

               

 

 


Post-Test Questionnaire:

Integrated User Satisfaction and Acceptance of Information Technologies (IUSAIT)

Instruction: Please read each of the following statements and indicate your agreement by marking an "X" in the appropriate box.

Strongly Disagree                                                      Strongly Agree

System Quality
1. The prototype has all the functions and capabilities I expect it to have.  1    2    3    4    5    6    7    NA
2. The prototype can reliably recover from errors.  1    2    3    4    5    6    7    NA
3. The interface of the prototype is pleasant.  1    2    3    4    5    6    7    NA
4. The organisation of the user interface (UI) elements on the prototype screens is clear.  1    2    3    4    5    6    7    NA
5. The prototype gives error messages that clearly tell me how to fix problems.  1    2    3    4    5    6    7    NA
6. The information (such as online help, on-screen messages and other documentation) provided with this prototype is clear.  1    2    3    4    5    6    7    NA
7. It is easy to find the information (such as from the online help) I need.  1    2    3    4    5    6    7    NA

Output Quality
8. The output I get from the prototype is
… accurate  1    2    3    4    5    6    7    NA
… complete  1    2    3    4    5    6    7    NA
… relevant  1    2    3    4    5    6    7    NA
… consistent  1    2    3    4    5    6    7    NA
… clear  1    2    3    4    5    6    7    NA
… understandable  1    2    3    4    5    6    7    NA
… timely  1    2    3    4    5    6    7    NA

Perceived Usefulness
9. I can effectively complete my work using this prototype.  1    2    3    4    5    6    7    NA
10. I am able to efficiently complete my work using this prototype.  1    2    3    4    5    6    7    NA
11. Using this prototype in my work increases my productivity.  1    2    3    4    5    6    7    NA
12. Using this prototype improves my performance in my work.  1    2    3    4    5    6    7    NA
13. I find the prototype to be useful in my work.  1    2    3    4    5    6    7    NA

Perceived Ease of Use
14. My interaction with the prototype is clear and understandable.  1    2    3    4    5    6    7    NA
15. Interacting with the prototype does not require a lot of my mental effort.  1    2    3    4    5    6    7    NA

16. It is simple to use the prototype.  1    2    3    4    5    6    7    NA
17. I find it easy to get the system to do what I want it to do.  1    2    3    4    5    6    7    NA
18. It is easy to learn to use the prototype.  1    2    3    4    5    6    7    NA

Perceived Enjoyment
19. I find using the prototype enjoyable.  1    2    3    4    5    6    7    NA
20. The actual process of using the prototype is pleasant.  1    2    3    4    5    6    7    NA
21. I have fun using the prototype.  1    2    3    4    5    6    7    NA

Behavioural Intention
22. Assuming I had access to the prototype, I intend to use it.  1    2    3    4    5    6    7    NA
23. Given that I had access to the prototype, I predict that I would use it.  1    2    3    4    5    6    7    NA
24. I plan to use the prototype in the next 6 months.  1    2    3    4    5    6    7    NA

Standard Support
25. The prototype supports the standard to which it is related.  1    2    3    4    5    6    7    NA
26. It is easy to extend the prototype to support newer versions of the related standard.  1    2    3    4    5    6    7    NA
27. The prototype reflects the main features of the standard.  1    2    3    4    5    6    7    NA

Overall Satisfaction
28. Overall, I am satisfied with the prototype.  1    2    3    4    5    6    7    NA
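For illustration, per-subscale scoring of the 7-point items might be sketched as below. The dict-based storage format is an assumption made for this sketch; item 8 (Output Quality) is omitted because it consists of lettered sub-items rather than a single rating:

```python
# Subscale-to-item mapping, following the questionnaire above
SUBSCALES = {
    "System Quality": range(1, 8),
    "Perceived Usefulness": range(9, 14),
    "Perceived Ease of Use": range(14, 19),
    "Perceived Enjoyment": range(19, 22),
    "Behavioural Intention": range(22, 25),
    "Standard Support": range(25, 28),
    "Overall Satisfaction": range(28, 29),
}

def subscale_means(responses):
    """Mean rating per subscale for one participant.

    responses: {item_number: 1..7 or "NA"} (assumed storage format).
    "NA" and missing responses are excluded before averaging.
    """
    means = {}
    for name, items in SUBSCALES.items():
        scores = [responses[i] for i in items
                  if responses.get(i) not in (None, "NA")]
        means[name] = sum(scores) / len(scores) if scores else None
    return means
```

Aggregating such per-participant means across the sample would then feed the statistical validation described in the Data Analysis section.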

  

   


Post-test Interview

Note: The interview should cover three major areas. The questions in the respective section are some suggestions. Please modify them and add new ones whenever appropriate and necessary. Please write down your modified and additional questions.

A. Prototype-Specific Questions

1. What do you think about the implemented process and user roles, i.e. workflow support? Consider the following qualities (how would you rate them, say 1 to 10, and why):

a. Usefulness
b. Completeness
c. Degree of adaptability
d. Degree of personalization
e. Interoperability with your relevant systems
f. Ease of use
g. Learnability

2. How does the prototype differ from the other systems you are aware of? Would you call it a very innovative work?

3. What are the strengths and weaknesses of the current implementation?

4. What changes would you propose to improve the prototype?

5. How relevant do you think the existing prototype implementation and your proposed changes are from your organisation's point of view?

(Prototype developers may be aware of certain implementation options and can then specifically ask for an evaluation of these alternatives.)

B. Information System Evaluation

Imagine the prototype is introduced in your organisation and an information system (IS) including end-user support and others is successfully created around it.

1. Does the prototype inspire you? Can you foresee an implementation in your organisation?

2. Do you think an information system based on the prototype can give your organisation a cutting edge/lead it to superiority?

3. What about the key concepts introduced, are they also applicable in your context?

4. How would you assess the opportunities and threats such an IS would create for your organisation? What is required to seize the opportunities/overcome the threats?

C. Critical Success Factors for Implementation and Deployment

1. From your point of view, what are the critical success factors for such a system? Please address:


a) the user perspective (competencies, attitudes etc.)
b) organisational perspectives (organisational integration etc.)
c) technological perspectives (must-have functional/non-functional requirements etc.) and
d) perspectives related to content and other resources

2. Who are the relevant stakeholders (or policy decision-makers) that need to be considered or consulted when it comes to the introduction of such a prototype?


References

DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean Model of Information System Success: A ten-year update. Journal of Management Information Systems, 19(4), 9-30.

Lewis, J. R. (1993). IBM Computer Usability Satisfaction Questionnaires: Psychometric evaluation and instructions for use. Boca Raton, FL, USA: IBM Corporation.

Lewis, J. R. (1995). IBM Computer Usability Satisfaction Questionnaires: Psychometric evaluation and instructions for use. International Journal of Human-Computer Interaction, 7(1), 57-78.

Venkatesh, V., & Bala, H. (2008). Technology Acceptance Model 3 and a research agenda on interventions. Decision Sciences, 39, 273-315.

Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.