
Rogō OSS Final Report

Project Information

Project Identifier: To be completed by JISC
Project Title: Rogō OSS Project
Project Hashtag:
Start Date: 01/09/2011
End Date: 31/01/2013
Lead Institution: University of Nottingham
Project Director: Andy Beggan
Project Manager: Dr Simon Wilkinson
Contact email: [email protected]
Partner Institutions: De Montfort University (http://www.dmu.ac.uk/home.aspx), University of East Anglia (http://www.uea.ac.uk/), University of Bedfordshire (http://www.beds.ac.uk/), University of Oxford (http://www.ox.ac.uk/), University of the West of Scotland (http://www.uws.ac.uk/)
Project Web URL: http://Rogō-oss.nottingham.ac.uk/
Programme Name: Assessment and Feedback Programme
Programme Manager: Heather Price

Document Information
Author(s): Dr Simon Wilkinson
Project Role(s): Project Manager
Date: 21/11/2012
Filename:
URL:
Access: This report is for general dissemination

Document History
Version | Date | Comments

Table of Contents

1 Acknowledgements
2 Project Summary
3 Main Body of Report
   3.1 Project Outputs and Outcomes
   3.2 How did you go about achieving your outputs / outcomes?
      3.2.1 Aims
      3.2.2 Methodology
   3.3 What did you learn?
      3.3.1 Competing Pressures
      3.3.2 Installation
      3.3.3 Servers
      3.3.4 Integration
      3.3.5 Customisation
      3.3.6 Standards
      3.3.7 Support
      3.3.8 Wider Community Engagement
   3.4 Immediate Impact
      3.4.1 Wider user-base
      3.4.2 Diverse Marketplace
   3.5 Future Impact
4 Conclusions
   4.1 General Conclusions
   4.2 Conclusions Relevant to JISC
5 Recommendations
   5.1 Better Installation
   5.2 Support Contract
   5.3 Authentication
   5.4 Development
6 Implications for the Future
   6.1 Sustainability
   6.2 Increased Openness
      6.2.1 Increased Integration
   6.3 Support Contract
   6.4 Testing Processes
7 References
8 Appendices
   8.1 Evaluation Questionnaire
      8.1.1 What was learnt from your time on the project?
      8.1.2 How have your views/attitudes altered to e-assessment?
      8.1.3 How have your views/attitudes altered to open source software?
      8.1.4 Will you be continuing to pilot Rogō in the coming year?
      8.1.5 Are there barriers from moving from the piloting phase to a production service phase?
      8.1.6 Do you need more help from the Rogō development team, more management buy-in, more academic or student buy-in, etc?
      8.1.7 Are there significant gaps in the functionality of Rogō which need addressing?
   8.2 Rogō Project Report – De Montfort University
   8.3 Rogō Openness Evaluation

1 Acknowledgements

The University of Nottingham is grateful for the time and effort made by our project partners (Bedfordshire, De Montfort, East Anglia, Oxford and the University of the West of Scotland) in seeking to launch Rogō as an open source community. Thanks also to Scott Wilson and Sander van der Waal from OSS Watch for their advice, and to Ross Gardler from OpenDirective for his work on the openness report.

2 Project Summary

The Rogō OSS project aimed to package up software developed at the University of Nottingham and create a vibrant open source development community around it. Rogō itself is an e-assessment management system aimed at the higher education enterprise level. Over the past year the University of Nottingham has worked closely with five partners to install and pilot Rogō within their own institutions: Bedfordshire, De Montfort, East Anglia, Oxford and the University of the West of Scotland.

All partners succeeded in installing and piloting Rogō in their own institutions, which points to Rogō being a viable piece of open source software. The main challenges observed during the project were: 1) installation, 2) server provision and 3) integration with other institutional systems. The development team have worked hard to resolve these issues. Rogō now comes with its own installation script and can install within a sub-directory (something the original version could not). Advice about server setup and configuration is now available on the project website. Integration with third-party systems is primarily in the form of LDAP for authentication and new LTI functionality which allows a VLE or other system to launch and single sign-on into Rogō.

3 Main Body of Report

3.1 Project Outputs and Outcomes

Output / Outcome: Work with partner HEIs to install and pilot Rogō at their institutions.
Description: The University of Nottingham has, through a variety of methods (site visits, email, telephone, etc.), helped all five partners to successfully install Rogō locally within their institutions.

Output / Outcome: Support materials to assist installation and rollout beyond the pilot, and a server installer.
Description: A getting started guide has been written in the TRAC wiki: https://suivarro.nottingham.ac.uk/trac/Rogō/wiki/WikiStart
This has been split into four sections: 1) System Installation, 2) Initial Settings, 3) External System Integration and 4) Branding. A server installer script has also been written which creates the necessary database structure and sets variables in the configuration file.

Output / Outcome: Development roadmap.
Description: Partners have been posting requirements as a series of tickets into the TRAC system. Those agreed as a priority by the development team have been combined with local University of Nottingham requirements and summarised on a high-level roadmap within the wiki: https://suivarro.nottingham.ac.uk/trac/Rogō/wiki/RoadmapSummary

Output / Outcome: Guidelines and system administrator documentation.
Description: User documentation ships as part of the online help system within Rogō. Developer documentation has been built within the wiki on TRAC: https://suivarro.nottingham.ac.uk/trac/Rogō/wiki
A getting started guide has been split into four sections: 1) System Installation, 2) Initial Settings, 3) External System Integration and 4) Branding. The database tables and fields have also been completely documented.

Output / Outcome: Integration with Moodle 2.x (exploring using IMS LTI).
Description: IMS LTI launch was developed for Rogō 4.2.2 (23/05/2012). IMS LTI marks return to the consumer tool was developed for Rogō 4.3.1 (02/11/2012).

Output / Outcome: Upgrade script.
Description: An upgrade script has been written that can alter the format of the database to suit new versions of Rogō. It has been designed so that it knows what the current version is and can make all the modifications necessary to be compatible with the latest release.

Output / Outcome: Partner reflections.
Description:
University of Oxford – http://youtu.be/oU_cwKlTZzY
De Montfort University – http://www.youtube.com/watch?v=DV2Bg2FwMxA
University of the West of Scotland – https://www.youtube.com/watch?v=LY3-rKfAk7E

Output / Outcome: Released software versions.
Description: Source code available under the GPL v3.0 open source license.
4.3.1 - 02/11/2012
4.3 - 24/09/2012
4.2.4 - 21/06/2012
4.2.3 - 30/05/2012
4.2.2 - 25/05/2012
4.2 - 20/03/2012
4.1.1 - 17/11/2011
4.1 - 10/11/2011

Output / Outcome: Active community of users and developers.
Description: The focus for building a community of users/developers has been the TRAC website: https://suivarro.nottingham.ac.uk/trac/Rogō/
This provides a single place for:
- Code releases
- Documentation wiki
- Ticketing system to report bugs and request new features
- Test plans

Output / Outcome: Project website.
Description: A new website has been created at: http://Rogō-oss.nottingham.ac.uk

Output / Outcome: Dissemination outputs.
Description: Conferences – members of the development team attended, presented and ran stands at a number of conferences:
- Dev8D, London – lightning talk (14/02/2012)
- Dev8ed, Birmingham (29/05/2012)
- CAA 2012, Southampton (10/07/2012) – Stand; Paper (http://caaconference.co.uk/pastConferences/2012/caa2012_submission_22.pdf)
- ALT-C 2012, Manchester (11/09/2012) – Stand
- Assessment Practitioners Group, Nottingham (02/11/2012) – Presentation (https://jiscsupport-assessmentandfeedback.pbworks.com/w/file/63459414/Assessment%20Practitioners%20Group.pptx)
- eAA Regional Event, Manchester (23/11/2012) – Presentation (http://www.e-assessment.com/civicrm/event/info?id=104&reset=1)

Output / Outcome: Promotional materials.
Description: Double-sided A5 flyer (https://jiscsupport-assessmentandfeedback.pbworks.com/w/file/63459150/rogo_a5_flyer.pdf) created and made available at:
- CAA 2012
- ALT-C 2012
- eAA Regional Event
Poster: CAA 2012 (https://jiscsupport-assessmentandfeedback.pbworks.com/w/file/63459234/rogo_a0_poster.pdf)
Banner: ALT-C 2012 (https://jiscsupport-assessmentandfeedback.pbworks.com/w/file/63459277/rogo_pullup_banner.jpg)

Output / Outcome: Project blog.
Description: A number of blog posts were made during the project on the University of Nottingham’s Learning Technology blog, one at the University of Oxford Medical School, and an interview was held with OSS Watch:
- http://comms.nottingham.ac.uk/learningtechnology/category/Rogō/
- https://learntech.imsu.ox.ac.uk/blog/?cat=3
- http://osswatch.jiscinvolve.org/wp/2012/09/13/Rogō-an-open-source-solution-for-high-stakes-assessment/

3.2 How did you go about achieving your outputs / outcomes?

3.2.1 Aims

The aims of this 12-month project were to update and promote Rogō, and to create support materials to facilitate its release as a publicly available open source e-assessment tool. In particular, the project sought to create, as a community, a leading e-assessment management system that is scalable, secure, re-usable and extensible.

3.2.2 Methodology

Aims and Objectives
The first step in the project was to work out the aims and objectives. As with any system, it is important to define the boundaries of what will be undertaken. With Rogō the emphasis was to pilot the system at five institutions and to make amendments that would make it easier for an open source community to be built around it. It was decided that, with the exception of LTI support, the project would not concentrate on a large development programme for new features.

Project Management
To manage communication across the project partners, a mailing list (Rogō[email protected]) and a TRAC website (https://suivarro.nottingham.ac.uk/trac/Rogō/) were created. TRAC has the advantage of providing, in a single place, a wiki, a ticketing system and a test plan manager. Regular posts were also made to the Learning Technology Section’s blog: http://comms.nottingham.ac.uk/learningtechnology/category/Rogō/

Key Activities
Although the aim of the project was not the development of major new functionality, there were a few areas of development needed to make Rogō easier to use for a wider range of institutions. To this end a number of sub-projects were completed:

1. Development of an installation script. This creates the necessary database structure, sets up the first SysAdmin user account and configures basic system variables.

2. Responding to advice from JISC regarding a possible trademark infringement, the project name was changed to Rogō and the software and website were rebranded.

3. An upgrade script was developed which can be run after a new version of Rogō is released. It was designed, like Windows updates, to determine automatically which changes are required rather than running a specific version-to-version upgrade.

4. The ability to install into a sub-directory was prioritised, as this was holding one partner back from piloting.

5. The existing QTI 1.2 import routines were enhanced to offer better support for question import from other systems.

6. New LTI 1.0 support was built to enable closer integration with other systems such as VLEs.

7. A new language abstraction layer was built, with English and Polish language packs shipping as standard. This development will allow third parties to write packs for other languages in a way which is independent of the underlying, and changing, source code.

8. With a view to increasing engagement by potential developers, many of the styles were abstracted out into common CSS files and much of the code was re-written in a cleaner, more object-oriented way.
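The upgrade approach in item 3 amounts to a cumulative migration runner: the script reads the currently installed version and applies every schema change between it and the latest release, in order. The sketch below illustrates the idea in Python (the real Rogō upgrade script is PHP); the version numbers and SQL statements are hypothetical, not Rogō's actual schema changes.

```python
# Illustrative sketch of a cumulative upgrade script. Given the installed
# version, collect every later migration in release order. The versions and
# ALTER/CREATE statements below are invented for illustration only.

MIGRATIONS = [
    ("4.2", ["ALTER TABLE users ADD COLUMN lang VARCHAR(10)"]),
    ("4.3", ["CREATE TABLE lti_user (lti_id VARCHAR(255), user_id INT)"]),
    ("4.3.1", ["ALTER TABLE papers ADD COLUMN calculator TINYINT"]),
]

def version_key(version):
    """Turn '4.3.1' into (4, 3, 1) so that versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

def pending_migrations(installed):
    """Return the SQL for every migration newer than the installed version."""
    statements = []
    for version, sql in MIGRATIONS:
        if version_key(version) > version_key(installed):
            statements.extend(sql)
    return statements
```

With this shape, a site on 4.2 gets the 4.3 and 4.3.1 changes in one run, while a site already on 4.3.1 gets nothing, which matches the "knows what the current version is" behaviour described above.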

Wider Community Engagement
A number of techniques were employed to engage the wider community. A mailing list was created and now has 20 subscribers. Members of the development team also attended and participated in a number of conferences, workshops and other events attended by a wide variety of institutions. Personal communication through email was often used by institutions considering piloting Rogō.

Testing Review
Experience during the first half of the project highlighted the importance of a robust and comprehensive approach to finding software defects. With this in mind, it was decided to employ a contractor to review current practices and to advise on a long-term test strategy.

Evaluation
As part of the JISC Assessment & Feedback Programme Meeting (17th October 2012), three of the project partners created YouTube clips about their experience on the project:

University of Oxford – http://youtu.be/oU_cwKlTZzY
De Montfort University – http://www.youtube.com/watch?v=DV2Bg2FwMxA
University of the West of Scotland – https://www.youtube.com/watch?v=LY3-rKfAk7E

A questionnaire was also sent to all project partners to elicit their views of the project and open source software more generally. Full responses can be seen in Section 8.1 below.

De Montfort University also provided a report of their experience dated 8th October 2012.

The University of Nottingham, through communication with OSS Watch, worked with OpenDirective to provide information on current development and engagement practices. Based on this information a report on the project’s openness was produced (see Appendix 8.3).

Sustainability
A meeting with OSS Watch was held to discuss future sustainability models for both the Rogō and Xerte open source projects.

Dissemination
See section 3.1 above for full details.

3.3 What did you learn?

3.3.1 Competing Pressures

The first, somewhat unexpected, finding was the pressure of running Rogō ‘internally’ for the University of Nottingham while also running Rogō for the open source community. Although the code is the same, the expectations of these two communities differ. The main source of tension has been work which would benefit the community but not directly the University of Nottingham. For example, the creation of a language abstraction layer and the switch to UTF-8 to support foreign characters was not a requirement locally. However, it was a calculated decision: developing the ability to add language packs could encourage institutions from non-English speaking countries and thus expand the potential open source community. As with many of these decisions, there is a short-term investment of time in the hope of a return in the mid to long term.

3.3.2 Installation

Installation proved more of a challenge than expected. There appear to have been five contributing factors:

1. The project did not start until the beginning of the academic year, which we suspect proved challenging for those who had already established their learning technology infrastructure for the coming academic year.

2. The number of possible combinations of Apache, MySQL and PHP is large. One site encountered authentication issues which were subsequently tracked back to a lack of encryption support in a specific version of PHP. A work-around was developed by Nottingham to fall back on an older encryption method.

3. Where partners were working within a single school and not across the institution, further delays were introduced because negotiation with central support services required additional query handling and reassurance before installation could progress.


4. One site requested the ability to install Rogō into a sub-directory, rather than the root of the web server. Although the suggestion was made to run a virtual host, installation at this site could not proceed until the required functionality was developed.

5. Through the pilot, bugs were detected within the original Rogō installation script. These issues were fixed in later releases of the software and a more rigorous testing procedure introduced.
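The work-around in point 2 above, falling back on an older encryption method when the preferred one is unavailable, can be illustrated generically. This is a Python sketch of the pattern, not the actual PHP fix; the choice of algorithms and the salting scheme are assumptions made purely for illustration.

```python
import hashlib

def hash_password(password, salt):
    """Hash a salted password with the preferred algorithm where the
    runtime supports it, otherwise fall back to an older one.

    This mirrors the shape of the work-around described above; the
    specific algorithms (SHA-512 with a SHA-1 fallback) are illustrative,
    not what Rogo actually uses.
    """
    data = (salt + password).encode("utf-8")
    if "sha512" in hashlib.algorithms_available:
        return "sha512:" + hashlib.sha512(data).hexdigest()
    return "sha1:" + hashlib.sha1(data).hexdigest()
```

Prefixing the stored hash with the algorithm name lets a later verification step know which method was used, which matters once two code paths can produce stored credentials.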

3.3.3 Servers

One institution had a problem securing an adequate server. It would appear that there was a dilemma around whether to buy a new server specifically to run Rogō or re-purpose an existing one. Before the system is piloted it is difficult to say whether an institution will wish to use Rogō in the long-term. A premature rush to buy a lot of new hardware could result in financial waste. This particular partner installed on a test Windows platform using a WAMP setup. Although the development team use Windows to develop Rogō, WAMP was not found to be a stable platform for high availability 24/7 deployment. After some discussions with the development team, the institution successfully migrated to a virtual LAMP stack.

3.3.4 Integration

Integration with third-party systems created challenges for most partners. For example:

Authentication
One partner tried to connect Rogō to their LDAP server. A telephone conversation confirmed that the Rogō software successfully connected to the LDAP server but was then unable to look up specific users. Unfortunately it was not possible to solve this configuration difficulty over the telephone. On the other hand, a site visit to a different partner did result in a successful LDAP connection and configuration. The key here was a knowledgeable member of the Rogō development team talking face-to-face with a member of information services who knew the structure of the local LDAP installation.

Another partner requested WebAuth as an alternative to LDAP. This is a classic open source conundrum – does the partner write their own solution before they have successfully piloted the system, or will the lack of WebAuth seriously affect the quality of the pilot? Other non-partner institutions have also enquired about CoSign and Shibboleth.

Student Records
At the time of writing no partners have integrated their Student Management System (SMS) for student module enrolments. This is most likely because many higher education institutions use their own bespoke solutions, a situation encountered by the University of Nottingham when it visited a number of UK universities in the summer of 2010. One partner expressed a desire to connect its SMS to Rogō, but development work is needed to export the data in a format which Rogō can understand.
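The missing piece described above is an export step that flattens SMS enrolment records into something an assessment system can read. As a purely hypothetical illustration (the column layout and CSV format here are assumptions, not Rogō's documented import format), such an export might look like this:

```python
import csv
import io

def enrolments_to_csv(records):
    """Flatten SMS enrolment records into username,module_code,year rows.

    'records' is a list of dicts like
    {"username": ..., "modules": [...], "year": ...}. Both the input
    shape and the output columns are hypothetical; a real export would
    need to match the target system's actual import format.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["username", "module_code", "academic_year"])
    for rec in records:
        # One row per (student, module) pair.
        for module in rec["modules"]:
            writer.writerow([rec["username"], module, rec["year"]])
    return buf.getvalue()
```

The point of the sketch is that the hard part is rarely the flattening itself but agreeing the shared format, which is exactly the development work the partner identified.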

3.3.5 Customisation

A valuable lesson was learnt when one partner upgraded Rogō and subsequently lost its logos and local branding when the directory structure was overwritten. The Rogō development team have now moved local branding into the /config directory which is not overwritten in upgrades. Additional upgrade instructions have also been added to the wiki.

3.3.6 Standards

Standards proved much harder to implement than expected. For example:

IMS Question & Test Interoperability (QTI)
The IMS Question & Test Interoperability (QTI) 1.2 standard was difficult and time-consuming to implement. Several weeks were spent working on import routines using QTI-formatted questions from WebCT as a test platform. Many of the difficulties surrounded the different approaches taken by assessment systems and the QTI standard itself. The assessment systems could be described as using a ‘physical’ representation of questions (e.g. MCQ), whereas QTI uses a ‘logical’ representation (e.g. a collection of options with one correct answer). This mismatch between physicality and logicality was highlighted in a couple of ways when going between WebCT and Rogō. WebCT has a ‘Jumbled Sentence’ question type, which is a paragraph of text with various dropdown menus replacing certain words with a variety of options. It also has a ‘Fill-in-the-blank’ question type, which is a paragraph of text with empty textboxes interspersed. In Rogō both question types are represented by what Rogō calls ‘Fill-in-the-blank’; the empty textboxes or dropdown menus are just presentation settings within this question type. Of course this presents no problem going from the two formats in WebCT to one format in Rogō, but outputting a ‘Fill-in-the-blank’ question from Rogō with dropdown menus results in errors when importing back into WebCT.

In a second example, WebCT has a question type called ‘True/False’. Rogō has one called ‘Dichotomous’, but this contains blocks of multiple true/false questions. Conceptually the WebCT true/false question type is similar to a two-option Rogō MCQ question. At the beginning of the project the development team discussed the best way to solve this problem. The option of detecting WebCT as the source tool and automatically converting True/False to MCQ was considered. In the end it was decided that if a new single-item True/False question type was created in Rogō, the WebCT questions could import directly and would be easier to track as a separate question type once in Rogō instead of being merged into the rest of the MCQ question bank.
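The physical-versus-logical mismatch described above boils down to a many-to-one (and one-to-many) mapping between the two systems' question types. A minimal sketch of the import-side mapping, using the type names from the text (the function and the presentation-setting keys are illustrative, not Rogō's actual code):

```python
# Illustrative import-side mapping between WebCT and Rogo question types.
# Both WebCT types collapse onto Rogo's 'Fill-in-the-blank', distinguished
# only by a presentation setting; 'True/False' keeps its own Rogo type
# rather than being merged into MCQ, as decided above.

WEBCT_TO_ROGO = {
    "Jumbled Sentence": ("Fill-in-the-blank", {"display": "dropdown"}),
    "Fill-in-the-blank": ("Fill-in-the-blank", {"display": "textbox"}),
    "True/False": ("True/False", {}),
    "Multiple Choice": ("MCQ", {}),
}

def map_question_type(webct_type):
    """Return the Rogo type and presentation settings for a WebCT type."""
    try:
        return WEBCT_TO_ROGO[webct_type]
    except KeyError:
        raise ValueError("No Rogo mapping for WebCT type: " + webct_type)
```

The export direction is where the trouble lies: because two WebCT types map to one Rogō type, the reverse mapping is ambiguous and has to inspect the presentation setting, which is exactly the re-import failure the text describes.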

IMS Learning Tools Interoperability (LTI)
The development team investigated LTI 1.0 as a way of providing a secure single sign-on launch for any LTI-conformant VLE. At Nottingham, Moodle 2.2 (the institutional VLE) was used as the test platform.

A lot of time was taken up discussing how best to implement LTI at the Rogō end [1]. It soon became apparent that the main limitation of LTI is that, although it supports authentication by the passing of shared keys, it does not support sophisticated authorisation. Although a degree of authorisation is supported, this appeared to be limited to set roles. Rogō supports the role of ‘External Examiner’ but this is missing from the LTI specification. The LTI specification says that a user must be uniquely identified but should not contain identifying information. This posed the largest problem: for Rogō to effectively maintain security it needs to unambiguously identify the current user and calculate their permissions. The solution was a slight compromise. A new table was added to Rogō to hold LTI/user pairings. The first time a user launches through LTI, and there is no record, Rogō asks the user to log in as normal. The next time the user launches through LTI, the system looks up the LTI user ID and translates this into a Rogō user ID which it can then use to work out permissions.
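The pairing-table compromise just described can be sketched as a simple lookup: on launch the opaque LTI user ID is resolved to a local user, and when no pairing exists the user logs in once and the pairing is stored. In this Python sketch an in-memory dict stands in for Rogō's actual database table, and all names are illustrative.

```python
class LtiPairingStore:
    """Sketch of the LTI/user pairing table described above. The opaque
    LTI user ID is translated into a local user ID, with the pairing
    created after a one-off normal login on first launch. An in-memory
    dict stands in for the real database table."""

    def __init__(self):
        self._pairings = {}  # lti_user_id -> local user ID

    def resolve(self, lti_user_id):
        """Return the local user ID, or None if no pairing exists yet."""
        return self._pairings.get(lti_user_id)

    def pair(self, lti_user_id, local_user_id):
        """Record the pairing after the user's one-off normal login."""
        self._pairings[lti_user_id] = local_user_id


def handle_launch(store, lti_user_id):
    """First launch: prompt for a normal login. Later launches: resolve
    the LTI ID silently and proceed with the local user's permissions."""
    local_user = store.resolve(lti_user_id)
    if local_user is None:
        return ("prompt_login", None)
    return ("launch", local_user)
```

This keeps the LTI identifier opaque, as the specification requires, while still letting the local system calculate permissions from its own user record.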

3.3.7 Support

General Support
Support, in some cases, was more difficult than expected. An example of this was when one site reported a feature not working as expected, but when the development team investigated on their own server everything worked fine. There are probably two main contributing causes: 1) it can be difficult to describe exactly where and how an error occurs, and 2) the version of Rogō used at the University of Nottingham sometimes differed slightly from the latest release. To solve the first problem, one partner site created a video clip of the interaction leading up to the problem. This was certainly an efficient way to show the development team unambiguously where the problem lay. The second problem can probably be solved by a more rigorous approach to versioning. Minor fixes should be more clearly labelled, with either a further point release or possibly appended letters. Knowing more precisely which version a site is using would allow the development team to investigate the code in the relevant branch in SVN. However, this raises an interesting issue for the future – should bug fixes be applied only to the latest version of the software or to past versions as well? Obviously, if bugs were fixed in past versions too then a clear statement should be available detailing how far back fixes will be made.

Summative Exam Support
Two of the project partners, and two non-project institutions, have raised the issue of support for Rogō. This is not support in terms of development but support in the context of summative exams. The perception appears to be that installing a system for formative exams is fine, but running critical high-stakes summative exams online requires potential support contracts in case of emergency. Several institutions have mentioned that they have support contracts with commercial e-assessment providers and that such contracts are attractive in mitigating risk. This support issue appears to be one of the more important factors which could potentially hold back the expansion of the Rogō OSS community. The use of Rogō at the enterprise level for mission-critical operations appears to be different from that of, say, an open source app which users may download and use with less thought about risk.

3.3.8 Wider Community Engagement

[1] http://comms.nottingham.ac.uk/learningtechnology/2012/05/14/lti-in-rogo/

The main findings from wider community engagement appear to be: 1) targeted e-assessment events such as conferences and workshops are good at whetting people’s appetites, and 2) personal communication, initially via email and then possibly a site visit, appears important in the selection process an institution needs to go through. Unlike smaller-scale open source projects, the enterprise nature of Rogō, coupled with the importance and risks of e-assessment, seems to be steering potential institutions to make personal contact with the vendor (the core development team at the University of Nottingham). In many ways this is related to the issue of support mentioned above – institutions need to know where their enterprise-level software is coming from, and simply saying that it comes from a download URL does not appear to be sufficient.

3.4 Immediate Impact

3.4.1 Wider user-base

The most obvious impact of the project has been the creation of a larger community of users who are able to find bugs and other issues more quickly. Several items have been posted as tickets in the TRAC system. These can be read by the Rogō development team and fixed if they are bugs, or scheduled into a roadmap if they are usability issues or feature requests. This has enriched the quality of the software for both the University of Nottingham and its partners.

3.4.2 Diverse Marketplace

Although there are a number of different solutions for formative assessment, there are far fewer enterprise-level, high-stakes summative systems available. Personal communication with two institutions has confirmed that there have been performance issues with the market leader in this field. One immediate impact of releasing Rogō as open source software has therefore been increased choice in the marketplace. Rogō is also one of the few assessment systems to focus on support for the whole summative examination lifecycle. Many tools concentrate on question banks and exam delivery; Rogō supports both of these but adds specific support for external examiners, standard setting, item performance analysis and curriculum mapping.

3.5 Future Impact

In personal communication, one partner site refers to a changing culture towards e-assessment and open-source software. They have additional schools showing interest in the project and are set to pilot a summative exam in January 2013. If the experience of the University of Nottingham is repeated, this move online is potentially much larger than the simple presentation of questions on screen. For staff, Rogō supports many phases of the summative assessment lifecycle: external examiner reports, question analysis, question exclusion, objectives mapping, etc. The move to online delivery also opens up new possibilities in question writing which can benefit students. For example, audio and video can be integrated with many question types, and interactive question types such as the Area question, Image Hotspot and Labelling can easily be created from templates. However, changing a culture can be a slow process, and it will be something that the partner site itself will need to monitor as it works out how best to take advantage of the new e-assessment possibilities.

4 Conclusions

4.1 General Conclusions

It would appear, from personal communication with other projects and from the experience of the Rogō project, that building a development community relies on a strong user community. Several partners suggested features and some code ideas, but none became active developers during the project. It would seem that for mission-critical systems such as e-assessment, faith needs to be built in a particular system before further development time will be invested. While this finding may appear self-evident, it should perhaps be used to guide future projects embarking upon an open source route. Initially the focus should be on fostering a strong community of users who are actively engaged with the software's features and use, before starting to build a strong community of developers. The techniques best suited to fostering a user community may differ from those best suited to fostering a development community. It must also be appreciated that some institutions may never make the transition from user to developer, for a variety of reasons.

4.2 Conclusions relevant to JISC

Implementing a large enterprise-level e-assessment system is a complicated process involving coordination between multiple parties: students, academics, student records, IT services, etc. Depending on local pressures and priorities this can take some time to achieve. Most HE institutions mainly engage with summative exams in January and May, the core exam periods. With the project starting in late September, coupled with the inevitable pressures at the start of the new academic year, finding time to prioritise a new open source project was difficult for several of the partners. With large enterprise-level projects it would be beneficial to begin in May, allowing suitable preparation time over the summer ready for the start of the new academic year in September.

5 Recommendations

5.1 Better Installation

Getting started for many installations was a time-consuming process. The Rogō development team aims to ease this problem by:

1. Documentation – The wiki will be extended to show all the steps and aspects to be considered by an institution thinking of installing Rogō. This will probably be written from two different starting points: a) an individual school thinking of adopting Rogō, and b) a central IT support unit wishing to install Rogō as a service across an entire institution.

2. Virtual Machine image – A virtual machine (VM) image can be created which will package up the whole LAMP stack needed to run Rogō.

3. Hosting – There is potential to explore options around providing Rogō as a hosted service. There are potentially two ways to do this: 1) partner with an existing company that has hosting experience, for example one of the Moodle hosts, or 2) host on a platform at the University of Nottingham.

5.2 Support Contract

In the context of high-stakes summative assessment, support contracts appear to be an important consideration. Two of the five partners, plus two non-project institutions, asked about contracts. The creation of an appropriate contract is a clear recommendation which would make Rogō more attractive as a serious enterprise-level piece of software.

5.3 Authentication

Authentication is at the heart of a secure e-assessment system. LDAP has been supported 'out of the box' for several years, but several institutions have enquired about alternatives such as CoSign, WebAuth and Shibboleth. Failure to integrate Rogō with a central authentication solution could become a barrier to some institutions adopting the system. Given that it would be unfair to expect the University of Nottingham to supply off-the-shelf solutions for every authentication eventuality, a useful middle ground would be to restructure the authentication code within Rogō to be more modular. This would make it easier for other institutions to write their own authentication modules and contribute these back to the project.
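As a rough illustration of the modular approach proposed here (sketched in Python for brevity, since Rogō itself is written in PHP; the class names and the account used below are hypothetical), each authentication method would sit behind a common interface so that an institution could plug in LDAP, CoSign, WebAuth or Shibboleth modules without modifying the core code:

```python
# Illustrative sketch only, not Rogō code: a plugin-style authentication
# layer where each method is a self-contained module behind one interface.

from abc import ABC, abstractmethod

class AuthModule(ABC):
    """Common interface every authentication module must implement."""

    @abstractmethod
    def authenticate(self, username, password):
        """Return True if the credentials are valid, else False."""

class InternalAuth(AuthModule):
    """Hypothetical fallback module using a local account table."""

    def __init__(self, accounts):
        self.accounts = accounts  # {username: password}, for illustration only

    def authenticate(self, username, password):
        return self.accounts.get(username) == password

class AuthChain:
    """Try each configured module in turn until one accepts the login."""

    def __init__(self, modules):
        self.modules = modules

    def authenticate(self, username, password):
        return any(m.authenticate(username, password) for m in self.modules)

# An institution would register its own modules (LDAP, CoSign, WebAuth,
# Shibboleth) in this chain without touching the core code.
chain = AuthChain([InternalAuth({"demo.user": "secret"})])
print(chain.authenticate("demo.user", "secret"))  # True
print(chain.authenticate("demo.user", "wrong"))   # False
```

The same pattern would apply to the database layer mentioned below: a thin common interface with per-backend modules.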

Although no one has yet asked for this, a similar modular approach could be taken regarding database integration. If one institution wished to use Oracle, for example, instead of MySQL, then a modular database library could make this much easier.

5.4 Development

Both in personal communication with developers at other institutions and in the Rogō Openness Evaluation report (see Appendix 8.3), it was recommended that the source code should be made directly available from the version control system. Currently only the source code of the last released version of the software is available for download. This presents a challenge for external developers wishing to extend Rogō: they would be extending the old version, not necessarily the version in development by the core development team. A recommendation would be to open up the SVN repository in read-only mode so that partners can download the latest copy of the code while it is in development. A potential drawback would be periods when significant architectural changes could cause certain unmodified scripts to break; however, it is expected that with clear communication these architectural changes could be publicised and due warning given to any affected developers. With potentially more developers contributing code, a more rigorous approach to testing becomes necessary. It is recommended that automated tests be used more extensively and run more frequently, for earlier defect detection.

6 Implications for the future

6.1 Sustainability

The member of staff recruited to support Rogō OSS through the JISC funding will continue in post through University funding and is an integral part of the Rogō development team. This should provide continuity in support as the Rogō project continues beyond its official JISC funding period. The University of Nottingham is also in communication with OSS Watch to explore further sustainability models and options, with the aim of expanding and strengthening the market.

6.2 Increased Openness

The Rogō Openness Evaluation report (see Appendix 8.3) and personal communication with OSS Watch recommended a number of priorities to make the project more open and potentially easier for developers to become involved with. The report broke the recommendations into a number of areas, including: legal, data formats & standards, knowledge, governance and market. The Rogō development team will review and implement as many of these as is practical.

In addition to the report's recommendations, one aspect highlighted by working for a few months with an outside contractor was documentation of the design/architecture of Rogō. Although the contractor was managed and directed by the Rogō development team, it was a good opportunity to observe how a potential outside developer would interface with the project. A clear gap in the documentation is around design – both code architecture and UI design. The intention is to expand the wiki to clearly document how the overall system is designed, following Garrett's (2003) framework.

6.2.1 Increased Integration

Related to increased openness in governance is a desire by the University of Nottingham to align its own e-assessment governance with that of the open source community. The Rogō development team believes there are advantages for the community in seeing University of Nottingham requirements and, vice versa, for the university in seeing the community's needs. It is hoped this will avoid duplicated effort and help spawn new development directions.

6.3 Support Contract

The University of Nottingham will explore the options around providing support contracts for institutions interested in running summative examinations. This may include working with interested parties to ascertain the appropriate level of support required and the terms under which this could be offered. There is also work to be done regarding what would be required for the University of Nottingham to offer this support and/or partnering a third party who could provide this.

6.4 Testing Processes

Observations from the University of Nottingham, the project partners and from the Rogō Openness Evaluation report have all pointed to the benefits of an enhanced test and release process. Work is beginning on several fronts to decrease possible software defects:

1. Review of test plans in TRAC as functionality changes with each version of Rogō. The intention is that some of the manual tests may be removed once they are covered by unit testing or Selenium.


2. A refactoring of code to a more object-oriented style, and the creation of unit tests that can verify the correct functioning of smaller units of code. With some parts of the system around ten years old, there is work to be done rationalising legacy code into a more modern style.

3. Work is underway to rewrite the authentication mechanism to a form-based method so that automated Selenium tests can be written.

The hope is that the unit tests will catch logic errors, the Selenium tests will catch user interface problems, and the manual test cases in TRAC will provide a final quality-control check.
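As a sketch of the first of these layers, the block below shows the style of unit test envisaged, applied to a hypothetical mark-scheme function (the function and its scoring rules are invented for illustration and are not actual Rogō code):

```python
# Sketch of the unit-testing layer described above. The scoring function
# and its mark scheme are hypothetical examples, not actual Rogō code.

import unittest

def score_mcq(selected, correct, negative_marking=False):
    """Score a single-best-answer MCQ: 1 mark if correct, otherwise 0
    (or -0.25 when negative marking is enabled)."""
    if selected == correct:
        return 1.0
    return -0.25 if negative_marking else 0.0

class TestScoreMcq(unittest.TestCase):
    def test_correct_answer_scores_one(self):
        self.assertEqual(score_mcq("B", "B"), 1.0)

    def test_wrong_answer_scores_zero_by_default(self):
        self.assertEqual(score_mcq("A", "B"), 0.0)

    def test_negative_marking_penalises_wrong_answer(self):
        self.assertEqual(score_mcq("A", "B", negative_marking=True), -0.25)

# Run the suite programmatically so it can sit alongside other checks.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestScoreMcq)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Tests of this kind pin down the intended mark scheme, so that a refactoring of legacy code which silently changed a scoring rule would be caught immediately.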

7 References

Garrett, J. J. (2003) The Elements of User Experience: User-Centered Design for the Web.


8 Appendices

8.1 Evaluation Questionnaire

A short questionnaire was sent out to all partners, consisting of the following questions:

1. What was learnt from your time on the project?
2. How have your views/attitudes altered to e-assessment?
3. How have your views/attitudes altered to open source software?
4. Will you be continuing to pilot Rogō in the coming year?
5. Are there barriers from moving from the piloting phase to a production service phase?
6. Do you need more help from the Rogō development team, more management buy in, more academic or student buy in, etc?
7. Are there significant gaps in the functionality of Rogō which need addressing?

When reading answers to the questionnaire it is worth appreciating the different contexts that each partner was working under:

Partner                              Main Contact                                    Usage/Potential Scope
University of Oxford                 Individual web-technologist                     Medical School
University of Bedfordshire           Learning Technology Team                        University-wide
University of East Anglia            Individual academic + Information Services      Medical School
University of the West of Scotland   Centre for Academic & Professional Development  University-wide
De Montfort University               Individual academic                             Faculty of Health & Life Sciences

8.1.1 What was learnt from your time on the project?

University of Oxford
From our time working with Rogō, we have learnt that it is an ideal system:

- Written in PHP and therefore easy for us to get to grips with
- Meets our needs in terms of authoring, hardware cost, performance and flexibility better than our current QuestionMark Perception (QMP) system

However, we have surprised ourselves by our reluctance to move from our current online assessment system, QMP, to Rogō. As we say in the attached paper (now accepted for Special Issue of the International Journal of e-Assessment), we have learnt that many of the same barriers to initial adoption of e-assessment systems apply to the move from one system to another (see 8.1.5 below). Our contingency measures are seemingly stricter than at some other institutions.

University of Bedfordshire
While a solution might work very well in one institution's context, the solution itself is often built within a context and infrastructure that is a requirement for its successful operation. Our interpretation of ROGŌ is that it is based very much on the background structures of Nottingham University (departmentally speaking) and that such structural tie-in would have required a lot of localised extra work (at the time of our initial engagement). Implementing ROGŌ and making it work effectively requires tight technical integration and alignment of teaching structures with the Nottingham model, as well as having resources devoted to teaching and supporting the new platform, for which we were unable to find the resource.

University of East Anglia
At our school we have very strong high-level support and enthusiasm to move forward and implement Rogō after looking at a number of possible solutions. We have learnt that enthusiasm and support within the school are not enough to move forward with implementing an e-assessment solution. Our school ring-fenced more than adequate funds to put in place the infrastructure and any support from yourselves that we might need. This was over two years ago now, and it was not until the last two months that we actually got a fully operational version of Rogō in place. The main lesson for us is that you cannot underestimate the difficulties, and the time it will take, to persuade a rigid and inflexible university-wide IT provider to change and embrace ideas that are not off-the-shelf Windows-based systems.

Now that we are using Rogō, in a small way so far, I had somewhat underestimated the difficulty non-campus-based faculty and clinicians would have in using and accessing the system, partly due to training issues and partly due to NHS firewall issues.

University of the West of Scotland
Communication and planning are essential for a successful implementation. The essential contributors need to be fully aware of what is intended and what their contribution is expected to be. ICT can be a great aid in such a project but can also be a major obstacle. Having obtained agreement from all parties, a plan needs to be put in place that clearly identifies the milestones, responsibilities and timescales. Then communicate, communicate, communicate.

De Montfort University
See Section 8.2 below.

8.1.2 How have your views/attitudes altered to e-assessment?

University of Oxford
One of the most interesting changes has been the realisation that our processes and attitudes to online assessment have been very much driven by online assessment with QMP. One of the many useful things to come out of our discussions with the Rogō team over the course of the project is that there are other ways to approach online assessment.

University of Bedfordshire
Not at all as a result of working with ROGŌ. We have always been, and continue to be, committed to various forms of e-assessment (perhaps an unhelpful term in that it covers so many options). We remain committed to computer-based tests and examinations but, whilst ROGŌ is an accomplished and feature-rich e-assessment solution, we prefer working with the existing tool that staff are familiar with (Blackboard). E-assessment remains as crucial as before, but having more features is a lower priority than the integration, familiarity and stability that our hosted VLE environment provides. Lack of in-house development resource also steers us towards an outsourced solution as being easier to support.

University of East Anglia
No, we still think that this is the only way to move forward, and the perseverance that has got us to this stage backs that up.

University of the West of Scotland
Our views/attitudes towards e-assessment have been positive and remain so. We would not rush implementation if it would have a negative impact on staff and student perceptions of e-assessment and, as a consequence, we may not have advanced as quickly as we initially envisaged, but equally we have not lost the goodwill of those involved.

De Montfort University
E-assessment is increasingly the only way to manage large groups of students in a climate of continual staff cutbacks. So I am introducing more summative phase tests (to replace practical write-ups and essays) and all my end-of-year examinations will include an optically marked MCQ section rather than manually marked short answers (plus some essays).

8.1.3 How have your views/attitudes altered to open source software?

University of Oxford
As big users of open source software, including packages (e.g. Sakai), development environments (e.g. Eclipse) and development frameworks (e.g. CakePHP), we very much like the idea of using an open source assessment system.

We have become more aware of the importance of creating and growing a community of users and contributors to reduce the impact of any single institution’s decisions and to safeguard the longevity of software. This community is particularly important with high-stakes software such as Rogō as it is also the source of support for any problems, in place of the support plans provided by commercial vendors.


We now understand the benefits of modular/plugin architectures which lower the threshold for others to contribute.

University of Bedfordshire
We have always had a positive attitude towards OSS in principle. However, it does seem to work best in scenarios where it has a large base and is supported by many developers. In our experience it works best when supported directly by an in-house resource with a good understanding of the technology and the ability to customise it to local requirements. In my opinion, it is vulnerable to ungovernable changes in resourcing – the loss of committed individuals in a project can lead to it becoming moribund and unusable.

University of East Anglia
I am a strong supporter of Open Source. The main drawbacks we have encountered are:

- Lack of expertise to support or develop the software in house because of a lack of expertise in the programming languages used.
- Frequent updates during early development (no worse than Windows patches).
- Difficulty in persuading a rigid IT department that open source has advantages.

University of the West of Scotland
In the period that Rogō was being installed and piloted, other open source software was also being implemented. The University has in a few years changed its attitude completely from a commercial-software-dependent institution to an open-source-based institution. ICT have had to undergo a transformation in how they deal with supporting applications, and this is not aided when applications fail to meet the expectations set by commercial products. It is hoped that Rogō will help with this transformation over the coming years and that it will meet staff and student expectations for stability and reliability.

De Montfort University
Our views on open source have not changed – however, getting the necessary support from central services to support departmental innovations appears limited. The challenge is to balance the workload of central services, supporting potentially a myriad of systems, with the desire of end departments to explore new systems.

8.1.4 Will you be continuing to pilot Rogō in the coming year?

University of Oxford
Yes. We are in the process of rolling out our new Perception v5 system (already in the pipeline when the project started) but will resume piloting Rogō once that is working correctly.

University of Bedfordshire
I'm afraid we won't be taking this project forward. Having tried Rogō out, we've found that its requirements for setup are too far out of line with our own approach.

University of East Anglia
We will most definitely be carrying on with Rogō; in fact, we aim to move all of our written assessments over to the system over the next two years. We also have interest from a number of other schools within the University wishing to use Rogō. We are planning on using Rogō in January for our first summative exam, which we are very excited about.

University of the West of Scotland
We will be continuing with Rogō and will be attempting to make it a mainstream item if the LTI link from Moodle can be made available.

De Montfort University
I will look at ROGŌ again to learn more about the skill of writing MCQs to different levels of ability; it was very helpful for learning the art of MCQ writing.


8.1.5 Are there barriers from moving from the piloting phase to a production service phase?

University of Oxford
Many of the same barriers which hinder initial adoption of CAA also apply to the move from one e-assessment system to another. We have broken those that apply to us down, in the accompanying paper, into:

- Resources and inertia
- Training/dissemination of good CAA practice
- Policy integration
- Risk propensity/confidence in the system: software maturity and support

University of Bedfordshire
I believe we perceived the system as needing to be set up around schools and classes. We were unable to accommodate this need and felt that we could not manage the resource required to bring it to the kind of scale we needed. We already deal with Blackboard and have a complex data feed to populate that system. We were using QuestionMark Perception and not requiring any kind of authentication into that system, and, as I said, we could not manage to assign the resource we believed would be necessary to implement Rogō.

University of East Anglia
Our main barriers are really outlined above. It's really a changing-culture issue. We have the will in the right places; it will just now take time, money and manpower to move forward.

University of the West of Scotland
There are funding implications in that the pilot server would be incapable of handling the load from a full implementation, and these need to be addressed. ICT are on board and few issues are foreseen as regards their contribution. Rogō would need to integrate with Moodle for any widespread usage to be considered.

De Montfort University
Yes – technical support from central services. The level of support was disappointing and I don't think DMU would be able to move forward at this time.

8.1.6 Do you need more help from the Rogō development team, more management buy in, more academic or student buy in, etc?

University of Oxford
Our main need is time. These are early days for us and we will need to build our confidence with Rogō, assuring ourselves that it does meet our reliability, security and support needs in particular. This will be through continued delivery of formative assessments – both 'access anytime' quizzes and more formal formative papers sat under examination conditions.

University of Bedfordshire
N/A

University of East Anglia
Yes. As we only recently got a working version of Rogō online, we need to consider where best to start with commissioning more support from the Rogō team. Academic and student buy-in has been good. Persuading our technical support team is our big challenge.

University of the West of Scotland
Help from the Rogō development team would be appreciated as there are still installation issues that need to be addressed, and they also have experience of dealing with large-scale usage which we have yet to gain. Management and staff buy-in will be required for the usage of Rogō in a summative setting, as the University has not as yet addressed the resource implications of large-scale assessment by computer. Staff will need to be convinced that an alternative to text-based responses can be valid, but there are sufficient early adopters who should be able to move the agenda forward.

De Montfort University
No more support at this time.


8.1.7 Are there significant gaps in the functionality of Rogō which need addressing?

University of Oxford

- Assessment timer
- Improved reporting
- Reduced strength of link between paper types and settings
- More modular/pluggable architecture to make community contribution easier
- Possibility of paid-for support

Of course, the beauty of OSS is that we could develop these features ourselves.

University of Bedfordshire
See section 8.1.5 above.

University of East Anglia
We would like to export an exam as an editable Word document. We would also like to look at a bulk upload of questions from, say, an Excel file. We use script concordance tests and it would be good to be able to write, and standard set, that item type.

University of the West of Scotland
The functionality as it currently exists would meet most of our current needs, but there is functionality that would assist staff when dealing with extended short answers, e.g. highlighting specific phrases, organising answers by size, and ensuring consistency of marking by allocating marks to all answers that meet a given criterion. These features may already exist and we have not yet encountered them.

De Montfort University
No – I thought Rogō was superb and full of potential.


8.2 Rogō Project Report - De Montfort University
8th October 2012

Dr Viv Rolfe, School of Allied Health Sciences
Ian Bloor, ITMS (Information Technology and Media Services)

Our aim is to introduce more online multiple choice tests as part of formative and summative assessment for Biomedical Science and Medical Science students. Our year 1 cohort is approaching 200 students and has seen dramatic increases in number, so MCQ is an attractive proposition for facilitating student assessment with tight staffing resources, and also for providing formative and diagnostic testing opportunities for students. I was therefore very grateful for the opportunity to be involved in the Rogō project.

Educational considerations

Involvement in the Rogō project provided a training opportunity for academic staff (including VR) to learn about both MCQ writing and design using Bloom's Taxonomy. The support materials provided with Rogō are excellent, and there was a steep learning curve, since the art of writing a good question is more of a skill than you would imagine.

The project also gave me the opportunity to introduce the idea of students learning about MCQs by writing their own questions as part of reflective practice to review lecture and practical sessions. Questions would then be edited by myself and distributed to the class. This served to 1) encourage reflective practice, 2) introduce alternative tasks and interactivity to teaching and 3) improve student experience of MCQ prior to the end of year examination.

Usability of Touchstone then Rogō

TouchStone 4.0 was installed onto the DMU server in February 2012 after some delay, and a test student was set up. My intention was to compare an MCQ test delivered via TouchStone (later Rogō) versus Blackboard versus a paper copy by the end of semester in March 2012. However, I didn't get as far as student testing.

After using TouchStone and setting up a test within it (February 2012), it was clearly very intuitive and logical software to use. It is simple to write and input questions. There is a good range of question styles available, and questions can include an uploaded photograph or image – both advantages over Blackboard, which does not allow this.

At this point in time, the use of TouchStone was a positive experience. However, technical problems were then encountered after trying to upgrade the system, and unfortunately student testing then ceased.

Technical Considerations

Installation of the later versions (4.1 and 4.2) onto the DMU server did not work (March 2012) and delayed further testing with staff and students.

Moving from TouchStone to a working version of Rogō proved to be a lengthy process. This was due to:

- Pressure on resources in ITMS (there were some periods where other work had to take priority or where staff were on leave). ITMS has also gone through significant organisational change this year.
- We were not able to install the official releases "out of the box" (i.e. without any changes to the source code) in DMU's environment.
- We tried version 4.1.1 in March, v4.2 in April, and v4.2.2 when it became available in May. In early June, a Rogō developer, connecting remotely, successfully installed it. One cause of the difficulties (highlighted by the developer) is that some of the components in DMU's software environment were old.


Pilot Testing of Rogō with Students

Finally, a version was installed, questions were uploaded, and a very simple pilot test was conducted with students in October 2012.

The process for uploading student data and enrolling students worked well.

Again, the process of writing questions and setting up a test was very straightforward for an academic member of staff with no technical support.

Technically it was difficult to set up a course / module and then associate it with a test. The help section wasn’t clear, but like all things, once the process is understood it is simple enough. This did require technical input.

Using Rogō for a test was flawless from a student perspective: the test was presented clearly and the navigation was logical.

Being able to access the student reports and data was again straightforward.

It was unclear whether there was a logout button, or any instructions on logging out. It's important that users know that they need to close their browser completely when they have finished. If they just close the Rogō tab or window and leave other tabs or windows open, they remain logged in. If this is done in labs where multiple users can access the computers, it would be a security threat.


8.3 Rogō Openness Evaluation

1. Openness Rating

The openness rating is created from a series of questions that relate to good practice in software projects seeking to reach sustainability through the adoption of an open source licence and related business models. It consists of a comprehensive questionnaire that focuses on key areas of open development practice. This questionnaire is completed by one of our evaluators. This produces a percentage score for each of five categories (legal, standards, knowledge, governance and market). This score gives an indication of how open the project is when measured against common practices in a range of successful open development projects.

It is important to realise that the goal is not to score 100% in each category. Not all projects intend to be maximally open. The intention is for these scores to highlight areas that the project team might want to review. If more openness in a given category is desired in order to support a given sustainability model then the scores will help guide the project team with respect to where improvements can be made.

Once an evaluation has been conducted a series of recommendations can be made that will improve the project's openness. These recommendations should then be considered by the project team. In discussing these recommendations the team will be forced to consider all aspects of openness with respect to the project and will therefore be in a better position to make optimal decisions.

It is not expected that all recommendations will be taken up. However, it is intended that there will be well-considered reasons for rejecting a recommendation. We are able to provide further assistance in evaluating individual recommendations and building a full strategic plan if necessary.

It is recommended that the project be evaluated periodically, with results being compared against previous results and recommendations. This allows progress to be monitored and appropriate course corrections to be made.

2. Rogō Openness Rating Version 1.0

This is the first openness evaluation for Rogō and is conducted in order to provide a baseline against a recommended level of openness within the project. The recommendations contained within this document are based on the objectives discussed with project team members. Prioritisation of these activities should be set according to resource availability and the strategic objectives of the project.

At this stage the future strategy for the Rogō project is somewhat unclear. During discussion with the project team there were some conflicting views with respect to the objectives set for the open source version of the Rogō application. This document therefore makes recommendations that will help move Rogō towards an optimally open project, and is consequently generic in its advice.

The one exception to this is that the project team expressed a specific desire to evaluate their release and audit processes. This report therefore prioritises recommendations that relate to these elements of project management.

It should also be noted that many of the evaluation criteria are subjective in nature. Consequently scores can vary between reviewers and even a single reviewer may attribute different scores as they become more familiar with a project. Small changes in subsequent reviews are often indicators of these variations. However, significant changes (5 percentage points or more) are usually a result of change within the project itself.

The results of this review should be used for planning resource allocations in the coming months. For example, a review might indicate that there is scope for improvement in both the governance and legal aspects of the project. Since most projects have a limited set of resources it is reasonable to expect one of these areas to be chosen as a focus of activity. A goal can be set, such as increasing the rating by 5 points in the next three months. Once the goal has been reached a new objective can be set. In this way ongoing openness reviews will help the project team focus on key areas of open development practice.

Finally, readers should recognise that this document is designed to identify areas that can be improved. Consequently it can appear to have an unreasonable level of emphasis on items that are less than perfect. In order to keep the document short we do not discuss items that are well implemented. However, by examining the evaluation scores, it can be seen that in most cases there is more that is good than there is bad. Readers should bear this in mind before jumping to conclusions about the overall performance of the project team.


2.1. Summary of Recommendations

In the detail sections below a set of recommendations for each topic area are identified. Each recommendation is assigned to a category of either "priority", "secondary" or "implemented". Priority recommendations are ones that we believe should be targeted first in order to maximise the project's ability to achieve sustainability through its chosen route. Secondary recommendations are ones that we believe should be enacted once the priority recommendations have been realised. Implemented recommendations are past recommendations that have been enacted.

Since this is the first openness review of Rogō there are currently no implemented recommendations.

In this section we summarise the priority recommendations found in the detail sections below. For further information about why a recommendation has been made, see the detailed discussion in the following sections.

2.1.1. Legal

Document and implement a repeatable license auditing process

Clarify license conditions for Calculator component

Adopt an automated source header verification tool

2.1.2. Data Formats and Standards

Document data standards used

Document development standards adopted

2.1.3. Knowledge

Define and implement a release process, including QA processes

Provide read-only version control access

Ensure all technical decisions are justified and reported to the mailing list

Make the mailing list the go-to place for users and developers

2.1.4. Governance

Define key roles in the project and identify decision making processes

Open the issue tracker to third parties

Provide access to development code and encourage third parties to submit patches

2.1.5. Market

Clarify ability to redistribute complete system under the chosen GPLv3 license


Engage third parties in active development/maintenance of the project

2.2. Detailed Evaluation

In this section we present detailed recommendations for each of the topic areas:

Legal - management of Intellectual Property within the project

Data Formats and Standards - focus on mechanisms for interoperability

Knowledge - knowledge management and sharing practices

Governance - strategy definition and decision making approach

Market - opportunities for commercial exploitation or reduction of costs by using project outputs

2.2.1. Legal

This section evaluates how open the software is from a legal standpoint. It is important that Intellectual Property is protected whilst also ensuring that third parties can reuse and engage with the project. Third party adopters will be looking for high scores in this section as they seek to minimise any legal risk when reusing software. An open source project should be seeking to score a minimum of 85%; closed source projects can afford significantly lower scores, but some aspects of this section cannot be ignored in either kind of project.

2.2.1.1. Summary

Evaluated Score: 75% Suggested Target: 85%

The project is performing reasonably well in this section. However, there are a number of high priority items that must be addressed if the intention is to allow this software to be adopted under an open source licence.

2.2.1.2. Observations

Whilst the licence is clearly stated on the web site and is included in some source code files, there are a number of files lacking licence headers and there is no obvious reference to the licence in an initial checkout of the code. It is good practice to ensure that all source files have appropriate licence headers. It is also good practice to have a LICENSE file in the root of the source tree.

An audit of dependencies appears to have been performed, although this is an informal human-only process. Consequently the results are incomplete. During review we found at least three dependencies which were not included in the audit (jQuery, jQuery Validator and ims-bti). Incomplete audits are likely to cause concern about the possibility of undiscovered licence compatibility issues. At best this will require potential re-users to conduct their own audits; at worst it will discourage any reuse.

There is also concern about a documented but inadequately justified dependency on a calculator component that is not open source. There is no indication of whether third parties can safely reuse and redistribute this component. Again this is likely to be a blocker to any downstream reuse.

It is recommended that the audit process be documented and that all releases be audited using this process. This process should include automated header scanning as well as human review. Full documentation of all dependencies and their licences should be presented in a NOTICE file.


2.2.1.3. Recommendations

2.2.1.3.1. Priority

Document and implement a repeatable license auditing process

Clarify license conditions for Calculator component

Adopt an automated source header verification tool

2.2.1.3.2. Secondary

Create and maintain LICENSE and NOTICE files in the root of the source tree

Clearly document all dependencies and their licences along with date of last full audit

Ensure all source files have license headers

2.2.1.3.3. Enacted recommendations

N/A

2.2.2. Data Formats and Standards

This section explores the project's use of data formats and standards. In general, more openly available standards and formats make it possible for other applications to work with the same data. This helps with sustainability in that the data becomes useful outside the initial application and third party tools can more easily interoperate with the software. This, in turn, helps support early adoption since users are less concerned about application lock-in.

In open source projects all code is available for reuse and therefore it can be argued that a lower score is acceptable for open source when compared to closed source. Others might argue that this is dependent upon the type of open source licence adopted. We consider a score of less than 75% suboptimal in most open projects.

Closed source projects will usually aim to score more highly (90% or more) unless proprietary or restricted formats are an important part of the chosen business model.

2.2.2.1. Summary

Evaluated Score: 73% Suggested Target: 75%

Whilst adoption of appropriate standards is evident, it is unclear whether these standards are all openly managed by independent standards bodies. It also appears that there is no formal development standard being adopted, which may make it more difficult for third parties to slot into ongoing development.

2.2.2.2. Observations

Clear documentation of the standards adopted and the process by which they are managed will serve to ensure all reviewers of the project are fully informed about the interoperability of the project's outputs and the ability for data to be shared. Similarly, the adoption of a recognised development process, e.g. XP, Scrum or PRINCE2, will enable third parties to navigate the development process more easily and thus increase the chances of them feeling comfortable with the prospect of engaging with the project at a development level. This ensures that third party systems can be built and tested against future releases in a managed way.

It would also be useful to consider documenting a standards policy for the project. Such a policy will define what types of standards are considered acceptable within the project. For example: does the project accept de-facto standards? If so, what level of penetration is required to allow it to be considered a standard?

2.2.2.3. Recommendations

2.2.2.3.1. Priority

Document data standards used

Document development standards adopted

2.2.2.3.2. Secondary

Document and adopt a standards policy

2.2.2.3.3. Implemented

N/A

2.2.3. Knowledge

This section evaluates the knowledge that is freely available within the project. Knowledge falls into two broad categories: user and developer. User knowledge is important as it promotes adoption; developer knowledge is important as it facilitates engagement.

Encompassed within the developer knowledge category is a sub-category of project management. This overlaps with the adoption of a project management standard evaluated in the Standards section. However, here we are more concerned with how the management processes seek to ensure knowledge is retained and shared. Consequently, at later stages of development, as community interest begins to grow, it often becomes necessary to document project management processes beyond technical management aspects.

An open source project should be aiming for very high scores in the developer documentation whilst a closed source project will focus more on user documentation. An open source project will typically aim for at least 75%, whilst a closed source project may aim much lower (sometimes as low as 25%).

2.2.3.1. Summary

Evaluated Score: 13% Suggested Target: 75%

There is very little visibility into the development practices or the reasoning behind the existing system architecture and implementation. Furthermore there is minimal user documentation. This lack of knowledge about the project and its development is a significant barrier to third party engagement.

2.2.3.2. Observations

2.2.3.2.1. Welcoming and engaging third parties

At first sight it would seem that the project is geared up to guide third parties, whether they are users or developers. However, on closer examination user documentation is sparse and developer documentation is both sparse and inaccurate. It is extremely hard to provide entry-level documentation that is both complete and accurate. A good strategy is to make it absolutely clear that the team is available to help fill in any gaps.

The project started a public mailing list in March 2012. However, this is not prominently linked from the site. Generally speaking it is a good idea to provide this information at the start of any page not known to be fully complete and accurate. A simple statement to the effect of "we are constantly working on our documentation, please bring your questions to our mailing list so that we might improve these docs for those who come after you" can have a significant impact on both expectations and engagement.

2.2.3.2.2. Visibility into project management

There appears to be good use of the issue tracker. Milestones are defined and issues are attached to these milestones. This is a good way of communicating expectations to potential users and contributors. However, this is somewhat undermined by the fact that past releases have been made without completing all issues assigned to them. Whilst it is to be expected that some issues might slip, it is also expected that they will be reassigned to subsequent milestones. Failure to do so makes future roadmaps unreliable with respect to third party engagement planning.

It is also evident that the majority of discussions relating to the project architecture, design and implementation take place outside of the public mailing list and the issue tracker. Furthermore, the results of these discussions are not communicated to any potential community members. This means that it is impossible for any third party to evaluate the validity of these decisions or their impact on intended use outside the decision makers' sphere of influence. It also prevents third parties from contributing appropriately to decisions.

This lack of opportunity for third parties to engage, while limiting the expertise available within the project community, also prevents trust being developed with third parties. This lack of trust can result in third parties being unwilling to engage for fear of the project taking a direction that is not compatible with their own needs. To remedy this situation all technical decisions should be made on the public mailing list. This does not prevent developers meeting to discuss options, it simply means the results of such discussions are posted to the public list and a short period of time is given to allow community feedback.

Whilst this may feel like a waste of time while there are no active members of the community, there are a number of hidden benefits. Firstly, the act of reporting decisions and their justifications to the public list results in an improved level of documentation about the architecture and its implementation. Secondly, any third parties considering engagement with the project are presented with clear and concrete ways in which they can engage.

2.2.3.2.3. Release and Quality Assurance Processes

Whilst there are extensive test scripts that the development team seek to apply before each release, it is evident that these are not adequate for the project's needs. During evaluation of the project a reviewer attempted to install the system. A problem was quickly discovered that prevented installation on any new machine. Such a fundamental problem should have been caught during testing prior to release.

It is a credit to the development team that a search of the issue tracker indicated the problem had already been identified. Furthermore the issue reported that the problem had been resolved and thus a fix was available in the version control system. However, there was no indication of what the problem was, no link to a patch file to resolve the problem and no way to check-out the latest development code from the version control system.

A mail to the development team resulted in the production of a new release. This release was made very quickly (good), but the speed suggested that the release probably wasn't tested against the published test scripts. This is further evidence that the informal manual testing process employed by the project is inadequate.

The correct process would have been to encourage the interested party to check out the latest version from source control. This seems to be impossible, since we could not find any documentation on how to achieve this and, when asked, the development team simply provided a new release package.


Enabling third parties to work with the latest development tree provides an additional level of testing over and above that provided by the development team (a resource that is almost free of cost). Furthermore, direct access to the fix would have prevented an unnecessary delay in the third parties project while they waited for the new release to be cut.

Whilst a new release to fix a critical bug is important, it should be cut after the new code has been fully tested. This would mean that the short term impact on Rogō project resources would have been reduced, since a new release was not necessary to satisfy the third party's needs. As it turned out, we subsequently identified another problem that could have been rectified before the release, possibly with a patch to either code or documentation from ourselves.

As it turned out, we failed to install the new release as well. In this case our exploration of the problem seemed to indicate a configuration problem with our environment. This issue was not documented, but web searches indicated it is a common problem for projects like Rogō. We suspect the problem was in documentation, but since our time for evaluating the install had run out we ceased to explore further. Had we been a real user of the system, this would have been a significant missed opportunity to improve the testing and documentation of the 4.2.4 release, an opportunity that might not have been missed had access to Subversion been provided.

Continuous Integration

Very few of the tests adopted by the project are automated, yet on a cursory examination it would seem that the majority could be. The development team indicated, in interview, that there was some intention to move towards more automated testing. The importance of this objective cannot be overstated. Human-driven testing is error prone, time consuming and tedious. Automated testing, on the other hand, allows for continuous integration testing, which in turn can act as an early warning system for new problems.

It has been shown that the earlier problems are identified in the development process, the cheaper they are to resolve. A perfect example is the installation bug discussed above. An automated install test script would have identified this before release and saved considerable time on the evaluator's part, as well as within the development team.
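Such an install check can be very cheap to automate. The sketch below illustrates one possible smoke test that fetches the installer page on a freshly provisioned machine and fails fast if it shows an error instead of the install wizard; it is a hypothetical example, and the URL and error marker strings are assumptions rather than anything published by the Rogō project:

```python
from urllib.request import urlopen

# Strings that suggest a broken PHP page rather than a working installer.
# These markers are illustrative assumptions, not taken from Rogō itself.
ERROR_MARKERS = ("Fatal error", "Parse error", "could not connect")


def page_looks_healthy(html):
    """Return True if none of the known error markers appear in the page."""
    return not any(marker in html for marker in ERROR_MARKERS)


def smoke_test(url="http://localhost/rogo/install/"):  # assumed test URL
    """Fetch the installer page and fail fast if it shows an error."""
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    if not page_looks_healthy(html):
        raise RuntimeError("install page shows an error; block the release")
```

Run against a clean virtual machine before each release is cut, even a crude check of this kind would have caught the installation bug described above without any human effort.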

Version Control

As discussed above, there is no documented way to check out the code from version control. It is good that third parties can review the code on the website. However, being unable to check out the code prevents people from engaging with development activities, since it complicates the process of generating and applying patches. Whilst users are unlikely to want to work with the latest development code, those seeking to integrate new features or external tools will want to ensure that they can test against the future code rather than the current code.

An inability to identify problems introduced by code modifications early serves to increase the costs of adopting the Rogō application. This in turn reduces the potential for third party contributions in the form of release testing, documentation, patches and design evaluation. One of the key advantages of open source software is that, when managed openly, it reduces the costs of interoperability and collaboration for all parties.

2.2.3.3. Recommendations

2.2.3.3.1. Priority

Define and implement a release process, including QA processes

Provide read-only version control access

Ensure all technical decisions are justified and reported to the mailing list

Make the mailing list the go-to place for users and developers


2.2.3.3.2. Secondary

Implement automated tests

Embed continuous integration into the development process

Re-assign slipped issues when milestone release dates are approaching

2.2.3.3.3. Implemented

N/A

2.2.4. Governance

This section examines the way the project is managed and how decisions are made. This is a very important section, especially if the project is seeking to engage third party developers, either through direct contributions or through integration in third party tools.

It is necessary to reassure potential contributors that the project is cleanly managed and that any contributions they make will be cared for appropriately. Similarly, significant contributors need to be reassured that they will be able to influence the management of their contributions and thus take on some of the responsibility for project survival.

In addition to ensuring third party contributors are appropriately empowered, existing project members will want to ensure that they are able to say "no" when necessary. That is, they will want to ensure that they are able to maintain strategic control over the project. The project's governance model defines the mechanisms for reaching a decision and the options available to those seeking to engage with the decision making process.

Since project governance involves transparency there are significant links between the governance and knowledge sections. Often the implementation of improvements in one of these topic areas will have a related impact on the other. For this reason these two areas are often good candidates for an early focus of effort.

Typically a project will seek to score at least 85% in this section.

2.2.4.1. Summary

Evaluated Score: 16% Suggested Target: 85%

There are almost no documented processes relating to the strategic management of the project and the decision making activities at a tactical level. That is, the operation of the project is completely opaque. It is entirely possible that the internal governance of this project is rock-solid. Certainly, the outputs appear to indicate a healthy project. However, this lack of transparency can only serve to discourage third party engagement with a project that offers no support or warranties.

2.2.4.2. Observations

Open source software provides no support or warranties. It is therefore very important that third parties are able to assess any risk associated with engaging with the project. As a user they want to be assured that the project will not die. As a potential contributor they want to be assured that they can engage with the collective management and maintenance of the project. For this reason it is critical that the project governance is both transparent and well documented.

Failure to be transparent in project governance whilst also offering no warranties or support is a sure way to ensure third parties are reluctant to adopt open source software. Put simply, the risks of adoption are unmanageable without visibility into the management processes. Even where visibility is provided, most participants will want to be assured that they can influence project strategy within clearly defined boundaries if they are to adopt the solution in mission critical domains.


These problems further compound those presented by third parties' inability to easily obtain the latest development code from version control. Similarly, these third parties are unable to report problems they face via the issue tracker. These two barriers to engagement make it impossible for third parties to resolve their internal issues in a cost-effective manner (lack of access to development code) whilst also being unable to evaluate the likelihood of core developers resolving the issue in a timely manner.

2.2.4.3. Recommendations

2.2.4.3.1. Priority

Define key roles in the project and identify decision making processes

Open the issue tracker to third parties

Provide access to development code and encourage third parties to submit patches

2.2.4.3.2. Secondary

Engage third parties in project management where appropriate

Remove Nottingham as a single point of failure for the project

2.2.4.3.3. Enacted recommendations

N/A

2.2.5. Market

This section explores the potential for organisations to generate revenue or realise cost savings by adopting and contributing to the open source project. For open source projects that seek to share the costs of development the goal is to score as highly as possible (a target of 70% should be the minimum for such a project). Closed source projects will often seek to limit the kinds of opportunities available to third parties and would typically expect to see a much lower score in this section.

It is important to understand that this section does not evaluate the potential for commercial engagement/cost reductions presented by the software, it merely examines the general opportunities and generic risks that exist.

2.2.5.1. Summary

Evaluated Score: 38% Suggested Target: 70%

Due to some uncertainties relating to IP management in the project there are potentially some costs involved with adopting the software. At the very least these costs will include a formal audit of IP. In addition, the project is packaged as a complete product that has a very limited market, which limits the potential for adoption beyond this niche.

2.2.5.2. Observations

Whilst the project is released under an open source licence, it is not clear that all code can be redistributed under the terms of this licence (see Legal section above). This may not affect use within a single institution, but is likely to impact any organisation planning to offer support and warranties based on the product. Even if this is not the case, a full IP audit would be necessary before such a company began trading.

Another barrier to potential business engagement with the project is the size of the market to which the solution is being applied. Again, this is not a barrier to an individual organisation seeking to reduce costs in a specific area. However, it can present barriers to commercial engagement with the project outputs. It is possible that some features of the project would be of value beyond this niche market. If this is the case, the project team may consider separating these areas of functionality out into separate modules that can be used independently of the complete product. In doing so, new opportunities for sharing development and maintenance costs may present themselves.


The final market related barrier is the fact that Nottingham is the only visible participant in the project. This means that a change in business objectives in Nottingham will likely reduce ongoing maintenance of the project to zero.

2.2.5.3. Recommendations

2.2.5.3.1. Priority

Clarify ability to redistribute complete system under the chosen GPLv3 license

Engage third parties in active development/maintenance of the project

2.2.5.3.2. Secondary

Consider separating out reusable libraries for potential reuse beyond the Rogō product

2.2.5.3.3. Implemented

N/A

2.2.6. Summary

In conducting this review we were specifically asked to examine the release and audit processes of the project. This document therefore focuses on these aspects, although many other areas for potential improvement have been identified. It is recognised that producing a maximally open project takes time and effort. Often it is hard to justify the time it takes to resolve the issues we have identified in this document.

It is hoped that the project team will be able to use the feedback contained in this document to prioritise their resources as appropriate for their strategic objectives. It is not expected that all recommendations will be fully adopted; however, we encourage the project team to consider the strategic implications of each recommendation and assign resources accordingly. OSS Watch has a blog post that explores the returns that can be expected when a truly collaborative environment is created.

2.2.7. About the authors

This document was created by OpenDirective, a consultancy company specialising in open source software development and adoption. The OpenDirective team delivering these evaluations is led by Ross Gardler, Vice President of Community Development at The Apache Software Foundation. He is supported by Steve Lee, a contributor to many open source projects, most notably at the Mozilla Foundation and the GNOME Foundation.

OpenDirective have hands-on experience of a great many open source projects with many different management styles and sustainability plans. We advise projects ranging from small academic projects through to large multinational projects involving multi-billion pound corporations.

OpenDirective's engagement with the Rogō project is managed by OSS Watch, the UK advisory service to the Higher and Further Education sector. OSS Watch provides a detailed understanding of the application of open source models to the academic sector.
