DARTS 2000 online diabetes management system: formative evaluation in clinical practice

Journal of Evaluation in Clinical Practice, 9, 4, 391–400. © 2003 Blackwell Publishing Ltd

Claudia Pagliari BSc PhD,1 Deborah Clark MBChB MRCGP,1 Karen Hunter BA RGN,2 Douglas Boyle BSc,2 Scott Cunningham BSc,2 Andrew Morris MSc MD FRCP2 and Frank Sullivan MBChB PhD MRCGP FRCP1

1 Tayside Centre for General Practice, University of Dundee, Dundee, UK
2 DARTS/MEMO Collaboration, University of Dundee, Dundee, UK

Correspondence: Dr Claudia Pagliari, Lecturer in Psychology and Informatics, Tayside Centre for General Practice, University of Dundee, Kirsty Semple Way, Dundee DD2 4AD, UK. E-mail: [email protected]

Keywords: diabetes, evaluation, informatics, primary care

Accepted for publication: 11 October 2002

Abstract

Rationale, aims and objectives Failure to engage in user-informed evaluation of emergent health informatics tools can have negative consequences for future implementation, related both to poor usability or clinical utility and to suboptimal stakeholder buy-in. This paper describes a formative evaluation in primary care of a multifaceted, web-based resource for diabetes management. The primary aims were to assess the usability and utility of the prototype in order to inform system refinements prior to implementation, and to investigate barriers and facilitators to system use so as to aid the development of a tailored implementation plan.

Methods A mixed-method approach involving survey, remote observation, semi-structured interviews and electronic feedback.

Context One Scottish local health care cooperative comprising five general practice surgeries and their staff.

Results A survey following temporary exposure to a dummy site revealed high levels of computer familiarity, welcoming attitudes and positive ratings of usability, format and utility. Comments mainly addressed content accuracy, feature suggestions and usability issues. Key barriers and facilitators to use included time and training. Remote observation following access to live clinical data enabled profiling of usage by individual and professional group. In the 3 month observation period administrators were the most frequent users, followed by GPs and nurses, with clinical and data entry screens accessed most often. Semi-structured interviews with key respondents sampled by professional group and usage frequency provided richer qualitative information on barriers and facilitators to use, patterns of system integration into work routines and system usability, content and utility. Content analysis of electronic feedback revealed mainly technical queries and general expressions of satisfaction.

Conclusions Evaluation informed a number of important and unforeseen improvements to the prototype and helped refine the implementation plan. Engagement in the process of evaluation has led to high levels of stakeholder ownership and widespread implementation.

Introduction

Political context

Several key policy documents published by the Scottish Executive within the last 5 years recognize diabetes as an important national priority and identify enhanced communication between primary and secondary care and electronic information sharing as major facilitators to improved patient care (SODH 1997, 1998; SEHD 2000). One of the concepts promoted in these and other national documents is that

of the managed clinical network, which promises to deliver improved efficiency and effectiveness through better sharing of information across and within health care teams.

Background to the DARTS project

The Diabetes Audit Research Tayside project (DARTS) was initiated in 1995 with the aim of creating a validated register of clinical records for all diabetic patients in Tayside, Scotland (Morris et al. 1997). This task was facilitated by the existing regional requirement, unique at that time, for all health service encounters to be tagged with a patient-specific identifier known as the community health index number, enabling record linkage. Manual validation of the database against practice records was achieved by a team of trained outreach staff. While the DARTS database was initially conceived as a resource for research and audit (its primary aim being to generate valid estimates of the number and characteristics of diabetics in the region), its potential as a tool for improving diabetes management was recognized and it was soon used to produce paper-based reports for individual practices, summarizing local and regional statistics on diabetes management (e.g. number of patients receiving appropriate care). In conjunction with outreach facilitation, this was used to encourage problem identification and target setting. While this approach was employed successfully for 4 years, its potential impact was constrained by asynchronous data feedback and input (at the time, the latter was achieved by means of manually completed optical character recognition forms) and non-universal access to practice reports. In parallel, it became apparent that in addition to its other benefits, the system of contacts built up through the DARTS project closely resembled the managed clinical network model.

The DARTS 2000 project aimed to take the existing facility several steps forward by creating an accessible, web-based source of timely, patient-specific, practice-specific and regional data that could meet both clinical and strategic decision support needs. It also aimed to harness the capacity of such a system to house other helpful management tools, such as guidelines, electronic images, patient information materials and information on key services and personnel. Its potential as a common information gateway for all clinical stakeholders was also recognized; hence it was labelled the Tayside Diabetes Network, shadowing moves to formalize the regional managed clinical network. The features of the prototype website are shown in Box 1. These were initially drafted by the DARTS programmers, based on the availability of existing data (records, guidelines, patient materials, research information), with the assistance of a steering group containing representatives from primary and secondary care and a patient advocate.

Box 1 Features of the DARTS 2000 prototype
• Patient-specific clinical information at the point of care (e.g. treatments, trends, outcomes)
• Shared data from primary and secondary care (general practice, hospital clinics, laboratories, retinal screening services, podiatry, etc.)
• Organizational information (diaries, clinics)
• Guidelines and summaries of evidence (Tayside Diabetes Handbook)
• Practice statistics, regional comparisons, audit facilities
• Patient information leaflets
• CVD risk calculator
• Links to professional bodies and diabetes support agencies
• Information about the DARTS project (development information, news, personnel, local research)
• Multiple search facilities, etc.
Restricted access is available via http://www.diabetes-healthnet.ac.uk/

Why evaluate?

The historical lack of evaluation in health care informatics has been recognized as a major problem (Friedman & Wyatt 1997; Mitchell & Sullivan 2001; Sullivan, Pagliari & Mitchell 2002). Moreover, the appropriateness of the research questions and methods commonly used in evaluation studies has been called into question by leading theorists, who argue that controlled trials to assess clinical impact and cost-effectiveness not only miss important aspects of a new innovation, but also seek to demonstrate effects that may not be apparent until well beyond the study period (Heathfield et al. 1998). It is becoming increasingly accepted that evaluation of new systems is essential not only to demonstrate their functional capability (e.g. robustness, databases) and their impact on clinical or efficiency outcomes (as in controlled trials), but also as part of the design process itself. Such formative evaluation studies involve users in the testing process and address questions such as whether the system meets the requirements of users, how easy it is to use, whether it has unanticipated influences on organizational processes or working practices, and what the local barriers to its adoption are. User involvement at this early stage can help to identify unexpected technical problems, suggest valuable additions or modifications and help plan effective implementation strategies, thus increasing the chances of successful system adoption. As discussed previously in the Journal of Evaluation in Clinical Practice, and elsewhere, the value of such an exercise is enhanced by the inclusion of both ‘subjectivist’ (primarily qualitative) and ‘objectivist’ (primarily quantitative) evaluation methods (Bürkle et al. 2001; Friedman & Wyatt 1997).

The study reported here aimed to assess user responses to the prototype DARTS 2000 system in clinical practice as a means of refining it to reflect the needs and preferences of the target audience more closely. Assessment of the technical functionality and robustness of the system was carried out separately by the system developers, although both teams worked closely with one another. The specific aims of this evaluation were as follows:

• to gather feedback on the usability and utility of the DARTS 2000 prototype in clinical practice, so as to inform system refinements prior to implementation across Tayside;

• to monitor usage patterns within and across professional groups to test/develop utility models, and

• to investigate organizational and personal barriers and facilitators to system use, so as to inform the development of a tailored implementation plan.

Methods

Design

A mixed-method approach, incorporating cross-sectional survey, remote user observation over time, semi-structured key-informant interviews and monitoring of opportunistic electronic feedback, was used.

Context and sample

One Scottish local health care cooperative (LHCC), comprising five general practices serving approximately 600 diabetic patients, was sampled. Thirty-eight staff members were eligible to participate, including 18 GPs, eight practice nurses (PNs) and 12 administrators (As: five practice managers, five clerical staff, two practice auditors). All received the questionnaire and had their site usage monitored remotely. Nine key respondents participated in the semi-structured interviews. In order to gain a spread of views, interview participants were selected by professional group (three GPs, three nurses, three administrators) and usage frequency, assessed via remote observation logs (high/mid/non-user). Electronic feedback was gathered opportunistically.

Tools

Questionnaire

This consisted of 43 mixed-format items, arranged in five sections: (a) self/role, computer/web access, literacy, use; (b) knowledge and attitudes regarding DARTS 2000, first encounters; (c) three five-point rating scales assessing usability, usefulness and presentation for the site as a whole and 14 individual features (plus space for free responses); (d) perceived barriers and facilitators to implementation (open), and (e) consent to remote monitoring (single item).

Interview schedule

The semi-structured protocol covered experiences and perceptions of the website itself (e.g. features, usability) and of the current implementation process (positive and negative aspects); personal usage in practice (including page preferences and integration into work routines), and perceived barriers and facilitators to regional implementation.

Remote monitoring

Password-only access allowed monitoring of usage by known personnel.

Electronic feedback

‘Comments’ buttons appearing on each screen allowed e-mailing of queries and suggestions to developers.

Page 4: DARTS 2000 online diabetes management system: formative evaluation in clinical practice

C. Pagliari

et al.

394

©

2003 Blackwell Publishing Ltd,

Journal of Evaluation in Clinical Practice

,

9

, 4, 391–400

Procedure

Recruitment

Participants were initially recruited by the LHCC, which had volunteered their cooperation prior to the start of the project. In addition, all stakeholders were invited to an area meeting, during which members of the evaluation team presented the study’s aims and objectives, supported by existing outreach workers known to the delegates. None of the potential subjects actively declined to participate.

Phase 1: dummy data released

The dummy site was released for initial consultation, whereupon all eligible staff received passwords, an information sheet encouraging them to access the site, and contact details for technical advice/help. After a trial period of 8 weeks, the questionnaire was distributed. Feedback of results at this stage facilitated the first iteration of the system and helped to inform the rollout procedures for phase 2.

Phase 2: live data released

Following system changes made on the basis of phase 1 feedback, live clinical data were made available and remote tracking of usage was initiated. (Access to patient-specific data was controlled carefully and participants were only able to view information relevant to their own patients or practice.) Observation continued for a period of 3 months in order to minimize the influence of novelty effects and to enable a realistic pattern of usage to establish. Observational data were used to select key informants for interview, as indicated previously. One-to-one interviews, lasting approximately 40 min each, were conducted at the informant’s place of work and were facilitated by access to the website and printouts summarizing the individual’s site usage patterns. Electronic feedback submitted opportunistically via the DARTS 2000 website was also collected over the 3 month period.

The site was further refined as a result of this phase, which also informed the regional implementation strategy. Although the process of feedback and iteration was, of necessity, continuous and dynamic, it was conducted in a systematic manner in order to enable tracking and auditing. At each stage, results were translated into key messages and action points for the development and implementation teams; these were methodically logged, prioritized and linked to plans for follow-up (e.g. target date, date actioned, personnel, etc.).
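The logged action points described above lend themselves to a simple structured record. The sketch below is purely illustrative: the field names and example entries are assumptions for the purpose of demonstration, not the project’s actual schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ActionPoint:
    message: str                       # key message distilled from feedback
    team: str                          # "development" or "implementation"
    priority: int                      # 1 = highest
    target_date: date
    date_actioned: Optional[date] = None  # None until the point is actioned

def outstanding(points):
    """Return unactioned points, highest priority (then earliest target) first."""
    return sorted(
        (p for p in points if p.date_actioned is None),
        key=lambda p: (p.priority, p.target_date),
    )

# Example entries (invented for illustration)
log = [
    ActionPoint("Clarify clinical review vs. data entry screens",
                "development", 1, date(2002, 3, 1)),
    ActionPoint("Arrange one-to-one training sessions",
                "implementation", 2, date(2002, 4, 1),
                date_actioned=date(2002, 3, 20)),
]
```

A record of this shape makes it trivial to audit which points remain open for each team at any stage of the rollout.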

Results

Questionnaire (following exposure to dummy site)

The questionnaire achieved a 65% response rate with one follow-up letter, despite its length and the intervening holidays. Respondents represented all professional groups and varying levels of interest in diabetes (e.g. some were responsible for practice diabetes clinics). Due to the length of the questionnaire, only key results are reported here.

Computer and web familiarity/usage

All respondents reported that they had access to a computer at work. Sixty-two per cent reported accessing the Internet at work on at least a weekly basis; 16% reported monthly access, and the remainder reported infrequent use (less than monthly or not at all). Free responses indicated several uses of the Internet at work, including professional development (guidelines, journal access), other clinical guidance (e.g. travel vaccination information, patient information leaflets) and leisure. Seventy-five per cent of respondents reported having web access at home, with a variety of uses, including e-mail, leisure and personal finance. All but one respondent indicated a positive attitude towards computers (rated 3–5).

DARTS as a whole

All respondents had heard of the website and 96% reported having visited it. Fifty-eight per cent had heard of it from a project worker and 37% from a colleague. Ninety-six per cent expressed a positive attitude towards the website (all ratings ≥ 3). Sixty-six per cent had been able to use it without assistance. All but one of those who required assistance had received it from a colleague. Thirty-three per cent expressed the wish for further training to be able to use the site (spread evenly across professional groups). Eighty-seven per cent thought they would use the website ‘often’ or ‘very often’ after it went live. All professional groups thought their nurse colleagues would use the website more often than GPs; there was a spread of opinion about how often it would be used by administrative staff. Few respondents had a good idea of how data would be entered on the system (free response).

Features of the website

The site as a whole and each of the 14 individual subsites were rated on three five-point scales relating to usefulness (utility), ease of use (usability) and presentation (interface). Not all individual features were rated by all respondents; for this reason, these results are presented in general terms. The overall website was rated useful, easy to use and well presented by all respondents (all scores ≥ 4). Most individual features were rated favourably on all dimensions by all respondents, with the majority receiving scores in excess of three. Features rated less useful were those containing general information about the DARTS project itself (‘News’ and ‘Research’). Helpful free-text comments were provided for all features, although not all respondents completed all of these sections. These responses were content analysed before being fed back to system developers. The four main content areas follow, along with representative examples of free text:

• factual content, e.g. ‘specialist nurses now at [named] health centre’ (PN);

• features, e.g. ‘a long-term view of where the eye van will be would be useful in our area’ (A);

• usability, e.g. ‘haven’t worked out how to download and transfer patient leaflets to two-sided A4’ (GP);

• format, e.g. ‘graphics/colour would enhance [leaflets] for patients’ (PN);

• general, e.g. ‘this will be useful’ (GP).

Perceived barriers and facilitators to implementation (free response)

Twenty-two of the 26 respondents completed this section.

Barriers

Twenty-two comments, characterized by three main themes spread evenly across the professional groups: lack of time; access to equipment and training; fear of computers. Anxiety over data validity was also raised.

Facilitators

Twenty-one comments, three main themes: training (ideally one-to-one); the system itself (demonstrating that it meets users’ needs for rapid and timely information); experience of users (growing familiarity). Integration into the process of care (making the system a standard part of the referrals and communication process) was also raised.

Consent to remote monitoring

All respondents agreed to be observed remotely.

Remote observation

Observational data were recorded for user, professional group and subfeature (broken down further from the questionnaire) to reveal variations in general and specific patterns of access. Altogether, there were 3900 ‘hits’ by study participants in the 3 month observation period. Administrators were the most frequent users with 1390 hits and 16 features accessed. Of these, the most frequent were form submission (data entry, 191), practice administration (password issue etc., 206), patient main screen (patient records, 239) and practice overview (practice diabetes statistics, 310). GPs were the next most frequent users with 1263 hits and 19 subsites accessed, those used most frequently being patient graph (bio-trends, 87), practice overview (132), patient diary (appointments and encounters, 214) and patient main screen (440). Nurses had slightly fewer hits than GPs (1248) and accessed 18 features, most frequently patient searches (114), patient main screen (126), practice overview (326) and form submission (419).
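Per-group usage profiles of this kind can be derived from a raw hit log with a simple tally. The sketch below is hypothetical: the log entries, user identifiers and feature names are invented for illustration and do not reproduce the study data.

```python
from collections import Counter

# Hypothetical hit log: one (user_id, professional_group, feature) tuple per page access
hits = [
    ("u1", "administrator", "practice overview"),
    ("u1", "administrator", "patient main screen"),
    ("u2", "GP", "patient main screen"),
    ("u3", "nurse", "form submission"),
    ("u3", "nurse", "form submission"),
]

# Total hits per professional group
hits_by_group = Counter(group for _, group, _ in hits)

# Per-group tally of which features were accessed, and how often
features_by_group = {}
for _, group, feature in hits:
    features_by_group.setdefault(group, Counter())[feature] += 1
```

The same two-level tally (group, then feature) yields both the headline hit counts and the ranked feature lists reported above.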

User interviews

Of the nine respondents originally targeted (high/mid/non-using GPs/practice nurses/administrators), only two administrators were available for interview (one high user, one non-user) and an additional mid-using GP volunteered to participate (n = 9). Interview responses were analysed by theme and content to uncover the primary issues. Overall, two main types of feedback were received via this route: (a) comments on the value of the resource itself and the place it takes in the care of people with diabetes, and (b) particular changes that users would like to see made to the system. Specific feedback relating to usability, utility and reliability was delivered to developers on a feature-by-feature basis to facilitate system iterations. Detailed feedback on experiences of and ideas for implementation was similarly delivered to the implementation team. For brevity, only broad results are shown, categorized by theme (H = high user, M = mid user, N = non-user).

Attitudes

These were generally very welcoming: ‘it will be a boon’ (A-L).

Perceived advantages /expectations

• ‘Improved communication between primary and secondary care’ (PN-H) was cited by most respondents. It was generally felt that the system would improve both the quality and speed of information sharing. ‘Reduced duplication of investigations and tests’ (GP-H) was one perceived benefit of this, as, ultimately, was ‘improved care’ (GP-H).

• ‘Improved ability to do audit’ (PN-H). A number of respondents cited the ability of the system to facilitate audit and governance by allowing reports to be produced rapidly.

Usability

No major problems were reported. ‘I think it’s quite easy, and I’m not a computer person’ (A-H). Most suggestions focused on the need for clarification, for example, ‘better indication of the difference between clinical review screen, data entry screen and patient summary sheet would be good’ (GP-M1). A number of specific technical problems with data fields were identified, such as, ‘it doesn’t seem to be possible to overwrite a classification of type 2 with type 1’ (A-H). Other technical usability issues related more to training needs: ‘passwords are a problem. They keep changing. I forget them’ (GP-M1).

Interface

Both negative and positive comments were received: ‘yellow bar takes up valuable screen space’ (GP-H); ‘I like the little picture of our practice’ (A-H).

Timeliness

A number of respondents observed that ‘information from secondary care isn’t always available quickly enough’ (PN-H). In all cases, this was attributed to less sophisticated administrative procedures at the secondary care end.

Content

These comments mainly concerned the appropriateness of data fields, for instance, ‘lack of medication information on summary sheet’ (GP-M1).

Common uses

• ‘Putting in clinical readings when the patient returns for a review’ (GP-M1). All respondents reported inputting some clinical data directly, although practices varied in the balance of responsibility for this between clinical and clerical staff.

• ‘Looking up individual patient data while they’re here . . . I get an idea of trends, etc.’ (GP-M2). Most clinicians reported using patient screens during the consultation.

• ‘To get leaflets when the patient is here . . . to explain what type 2 diabetes is, etc.’ (PN-M). This was reported by nurses.

Incorporation into workflow/where used

Respondents varied in their patterns of use. Most reported using the system during the consultation to pull patient records (as detailed earlier) or for other features (e.g. ‘for the eye van’, PN-H), while one thought this was too time-consuming. Most clinicians used the data recording features between consultations and the educational and research facilities at other times: ‘I generally use the handbook in my own time’ (GP-M1).

Professional–patient communication

Most GPs and nurses reported sharing screens with patients and receiving positive reactions. ‘They’re very impressed . . . the patients think I’m very up-to-date’ (GP-M2). ‘The risk calculator is popular with patients’ (PN-H).

Barriers to implementation

• Technical: compatibility with existing systems, for instance ‘double entry [with GPASS] is a problem’ (GP-M2); usability issues, such as ‘passwords didn’t work’ (A-L), ‘worrying warning messages’ (PN-M).

• Organizational: ‘time pressure’ (PN-H); perceived remit: ‘people using it are those with an interest in diabetes’ (PN-L).

• Human: ‘computer phobias’ (GP-M1); ‘habits’ (GP-H); ‘negative attitudes’ (PN-M).

Facilitators to implementation

• IT and outreach support: ‘prompt e-mail reply’ (GP-H); ‘[project workers] have been very good . . . it’s important to be pushed a little’ (GP-M2).

• Technical: ‘ease of use, fast system, up-to-date information’ (GP-M2).

• Evidence of benefits: ‘once the GPs see its usefulness, they’ll use it more’ (PN-H).

• Marketing: ‘there needs to be some sort of a grassroots promotion’ (PN-H).

• Human: habits/experience: ‘once you know how to use it you can fit it in’ (GP-M2).

• Training: ‘one-to-one training’ (PN-L).

Although the non-users interviewed were less able to provide informative observations on the features of the system or their integration into work processes, it was important to examine their barriers to use. In the case of the administrator, the primary obstacle had been technical (problems accessing the live system), whereas her attitudes and expectations were highly welcoming and positive. In the case of the non-using GP, the barrier was primarily attitudinal, namely the belief that computers were a time-consuming interference in the consultation and could not be relied upon; also that it was not clear whose responsibility it was to use the system (the respondent did not consider himself a diabetes specialist). Triangulation with comments from this respondent’s colleagues indicated that he was close to retirement and therefore may have been unwilling to change practice. The non-using nurse thought the system was a good idea but had not yet accessed it, partly because she did not feel it was within her remit as a non-diabetes specialist. She cited fear of the unknown as a barrier to use and identified one-to-one training as a means of changing her professional practice.

Electronic feedback

Forty-three comments relating to particular screens were received from pilot participants during the 3 month study period. These were content analysed into the following categories.

Usability

Nineteen comments relating to the ease of use of specific screens/features and suggestions for improvement: for example, ‘the little tables with all the data on are really good but are easily missed – could we make them a little more obvious?’ (GP).

Utility

Ten comments relating to the usefulness of the features and how this might be improved: for example, ‘we need a trend indicator in the visual acuity field as a drop in acuity is what’s being looked for’ (GP).

Data validity

Eight comments relating to the accuracy or timeliness of clinical data: for example, ‘shown as tablet treatment – is on orl/ins combined therapy as per validation data’ (GP).

Technical queries

Five technical queries relating to usage of the site: for example, ‘can’t get into individual patient screen. Says invalid date entered but it is correct’ (PN).

Praise

One comment: 'great to find so many newly diagnosed patients – most of whom go through the hospital' (GP).

Discussion

As a result of the feedback gathered during this exercise, a number of improvements to the prototype DARTS 2000 web resource were made, relating to additional features and enhanced accuracy, usability and interface design. The process of feedback and iteration was, to a large extent, continuous and dynamic, and therefore may not be presented in straightforward cause-and-effect or quantitative terms. Likewise, the results of this mixed-method study are too numerous to report in their entirety.

Nevertheless, the methods and results described here may provide an indication of how user-informed evaluation in the clinical context may be conducted and may add value to new developments. The evaluation researchers also worked closely with implementation workers, and the findings were incorporated into a comprehensive and successful rollout plan.

The challenges of applied health informatics evaluation research have been recognized by numerous commentators. These include: the need to judge the resource from multiple perspectives (e.g. usability, functionality, utility, acceptability, effectiveness); the complexity of the system to be evaluated (which may have many different components) and/or its changing nature (as in development projects involving continuous cycles of feedback and iteration); shifting external influences (e.g. other initiatives may be being implemented in parallel); and the inapplicability of controlled scientific methods (such projects study real users in real-world settings and precise outcome measures may not be available) (e.g. Stead et al. 1994; Friedman & Wyatt 1997). All of these challenges applied in the case of this study, which was conducted under the pressure of a planned rollout timetable that was initiated during the process. Fortunately, web-based systems can be modified centrally; hence it was possible to continue iterations after the rollout had started. Furthermore, the systematic process of feedback, prioritization and follow-up described previously enabled the majority of key changes to be made at an early stage.

The value of assessing both technical and human aspects of system use in formative studies is clear. While developing good, user-informed technology is one key to successful implementation, 'people and organizational issues' influencing professional behaviour change are equally, or perhaps even more, important, and their identification at an early stage is essential to allow appropriate interventions to be put in place (Lorenzi & Riley 2000; Kaplan et al. 2001; Kaplan & Shaw 2002). Indeed, it has been estimated that up to 50% of technically sound systems have foundered on staff revolt, boycott, sabotage or dissatisfaction (Dowling 1980).

Tailored implementation strategies, which take account of the evidence on the effectiveness of alternative approaches to professional behaviour change and link interventions to local barriers and facilitators, have been found to be more successful than non-tailored strategies, particularly when they involve multiple methods (NHS Centre for Reviews and Dissemination 1999). While, in this case, it was possible to draft an evidence-based implementation plan at an early stage (involving educational outreach, professional meetings, reminder systems, audit and feedback and opinion leader advocacy), the results of the questionnaire and interview studies enabled this to be tailored to the local context. Not surprisingly, lack of time (both for learning to operate the new system and for using it in practice) was one of the main barriers cited; however, a number of respondents observed that this becomes less important once using the system is integrated into daily diabetes patient management routines. Similarly, desire for one-to-one training on-site and evidence of benefit following usage were identified as key facilitators. This helped to highlight key messages for rollout and enabled more responsive training.

As mentioned in the Introduction, the development of the initial prototype had been based largely on the availability of information and the ideas of the development team and steering group. No formal models of the place of the system in the organization (who would be using which parts, how they would be using it, where it would be used) existed. For example, early discussions with the development team indicated an assumption that it would mainly be used outside the consultation for data management and educational purposes. In contrast, all the clinical users interviewed reported actively using the system within the consultation as a means of obtaining immediate information at the point of care. Furthermore, many screens were shared with patients in order to facilitate discussions about treatments, test results, trends and lifestyle changes. Two models of data management also emerged, with clinicians inputting their own patients' clinical data in some practices and devolving this task to a dedicated member of the administrative staff in others (via dictations and handwritten notes). This was partly a function of administrative staffing levels and partly a function of the computer literacy of clinicians. These qualitative observations were triangulated with quantitative remote observation data to produce a more comprehensive view of professional usage patterns.

Such information helped the implementation team to better tailor their training by, for example, presenting optimal and alternative models of system usage and task sharing/distribution, and designing training exercises to suit professional roles (focusing on areas known to be of particular use to particular types of staff). Likewise, interviews indicated that clinicians would like access to the Tayside Diabetes Handbook at home for the purpose of professional development. This had not originally been made available on the publicly accessible parts of the site due to concerns over access by patients. However, while some interviewees did express reservations about patient access (specifically fears over patients being more informed than themselves), the majority did not perceive this as a problem and, as a result of this feedback, the handbook is to be made available via the World Wide Web.

With regard to the period of remote observation, while 3 months may be considered lengthy, the research team would not have obtained as realistic a picture of system usage in practice had it been shorter. In the first month, for example, GPs were the highest users and administrators the lowest, probably reflecting the curiosity of the former and the fact that the latter had not yet fully taken on board the data processing tasks associated with the system. However, the usage patterns found in the full observation period closely resemble those that have persisted following the study.

DARTS 2000 has now been rolled out successfully across the Tayside region, with very good uptake and few technical problems (73 of 74 practices are actively using the system). Anecdotal accounts indicate that, in addition to its other benefits, user involvement in the process of development created a high level of stakeholder ownership, which has facilitated the implementation process. The site now forms the hub of the local managed clinical network. A number of new features have been added since the conclusion of the formal evaluation study; however, observation of usage patterns is ongoing and all users are actively encouraged to submit electronic feedback as a means of monitoring usability and functionality issues. DARTS 2000 has recently been embraced as the national standard and is to be used as the basis of a national clinical management system (SCI-DC). It remains to be seen how the system will be taken up in less technologically advanced areas. This, and the impact of the system on clinical and efficiency outcomes, will be assessed in a future study.

Acknowledgements

Thanks to the staff of Arbroath & Friockheim LHCC for their participation in this research, and to members of the DARTS Steering Group. DARTS 2000 was funded by the Scottish Executive.

References

Bürkle T., Ammenwerth E., Prokosch H.-U. & Dudeck J. (2001) Evaluation of clinical information systems. What can be evaluated and what cannot? Journal of Evaluation in Clinical Practice 7 (4), 373–385.

Dowling A.F. (1980) Do hospital staff interfere with computer system implementation? Health Care Management Review 5, 23–32.

Friedman C.P. & Wyatt J.C. (1997) Evaluation Methods in Medical Informatics. Springer-Verlag, New York.

Heathfield H., Pitty D. & Hanka R. (1998) Evaluating information technology in health care: barriers and challenges. British Medical Journal 316, 1959–1961.

Kaplan B., Brennan P.F., Dowling A.F., Friedman C. & Peel V. (2001) White Paper. Toward an informatics research agenda: key people and organizational issues. Journal of the American Medical Informatics Association 8 (3), 235–241.

Kaplan B. & Shaw N.T. (2002) People, organizational and social issues: evaluation as an exemplar. In Yearbook of Medical Informatics 2002 (eds R. Haux & C. Kulikowski), pp. 91–102. Schattauer, Stuttgart.

Lorenzi N.M. & Riley R.P. (2000) Managing change: an overview. Journal of the American Medical Informatics Association 7 (2), 116–124.

Mitchell E. & Sullivan F. (2001) A descriptive feast but an evaluative famine: systematic review of published articles on primary care computing during 1980–97. British Medical Journal 322, 279–282.

Morris A.D., Boyle D.I.R., MacAlpine R., Emslie-Smith A., Jung R.T., Newton R.W. & MacDonald T. (1997) The diabetes audit and research in Tayside, Scotland (DARTS) study: electronic record linkage to create a diabetes register. British Medical Journal 315, 524–528.

NHS Centre for Reviews and Dissemination (NHSCRD) (1999) Getting evidence into practice. Effective Health Care Bulletin 5, 1.

Scottish Executive Health Department (SEHD) (2000) Our National Health: a Plan for Action, a Plan for Change. The Stationery Office, Edinburgh.

Scottish Office Department of Health (SODH) (1997) Designed to Care: Renewing the National Health Service in Scotland. The Stationery Office, Edinburgh.

Scottish Office Department of Health (SODH) (1998) Acute Services Review Report. The Stationery Office, Edinburgh.

Stead W., Haynes B., Fuller S., Friedman C.P., Travis L.E., Beck R., Fenichel C.H., Chandrasekaran B., Buchanan B.G., Abola E.E., Sievert M.C., Gardner R.M., Messerle J., Jaffe C.C., Pearson W.R. & Abarbanel R.M. (1994) White Paper. Designing medical informatics research and library resource projects to increase what is learned. Journal of the American Medical Informatics Association 1 (1), 28–33.

Sullivan F., Pagliari H.C. & Mitchell E. (2002) Health Informatics. Royal College of General Practitioners, London.