
Formative Evaluation of the Implementation of the Medical Education Metrics Standard for Continuing Education

MedBiquitous Annual Conference 2011

May 11, 2011 Baltimore, Maryland

Presenters and Disclosures

• Francis Kwakwa, MA – Assistant Director, Data Management, Radiological Society of North America. Does not have an interest in selling anything to CME professionals.

• Andrew Rabin, BS – Chief Technology Officer, CECity. Does have an interest in selling a program or service to CME professionals.

• Sean Hayes, PsyD – Vice President of AXDEV Group. Does have an interest in selling a program or service to CME professionals.

• Valerie Smothers, MA – Deputy Director, MedBiquitous. Does not have an interest in selling anything to CME professionals.


Objectives of Today’s Session

• Increase knowledge about use of the Medical Education Metrics Standard (MEMS) in CME

• Share experience of using MEMS as part of Program and Activity Reporting System

• Present formative findings collected regarding the experience of those who have implemented MEMS

• Offer case study findings as to how to assess implementation of new systems


MedBiquitous Metrics Working Group

Mission

To develop XML standards & Web services requirements and descriptions for the exchange of aggregate evaluation data and other key metrics for health professions education.


MEMS: Medical Education Metrics

• Technology standard for core evaluation data

• Users
  – Educators want best practices, ability to compare
  – Funders want to measure reach and efficacy
  – Accreditors want to measure success of activity and provider

MEMS Data

• Activity Description – what's being evaluated

• Participant Activity Evaluation – what did participants think

• Participation Metrics – how many people participated

• Learner Demographics (all four categories are sketched as XML below)
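Since MEMS is an XML standard, these four categories ultimately travel as elements of a single report document. The sketch below assembles a toy report with Python's standard library; the element names are illustrative placeholders invented for this example, not the actual MEMS schema, which is defined in the published MedBiquitous specification.

```python
# Illustrative sketch only: assembles a MEMS-style report covering the four
# data categories. Element names are invented for this example; the real
# structure and namespaces come from the published MedBiquitous MEMS schema.
import xml.etree.ElementTree as ET

report = ET.Element("ActivityReport")  # hypothetical root element

# Activity Description: what's being evaluated
description = ET.SubElement(report, "ActivityDescription")
ET.SubElement(description, "Title").text = "Imaging Update Course"
ET.SubElement(description, "Format").text = "Live activity"

# Participant Activity Evaluation: what participants thought (aggregate)
evaluation = ET.SubElement(report, "ParticipantActivityEvaluation")
ET.SubElement(evaluation, "OverallRating", scaleMax="5").text = "4.6"

# Participation Metrics: how many people participated
metrics = ET.SubElement(report, "ParticipationMetrics")
ET.SubElement(metrics, "PhysicianLearners").text = "120"
ET.SubElement(metrics, "OtherLearners").text = "35"

# Learner Demographics: who the participants were
demographics = ET.SubElement(report, "LearnerDemographics")
ET.SubElement(demographics, "Specialty", count="98").text = "Radiology"

print(ET.tostring(report, encoding="unicode"))
```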

Program and Activity Reporting System (PARS)


• Manual data entry

• Comma delimited upload (a parsing sketch follows this list)

• XML upload
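For the comma delimited path, a provider typically starts from a flat export of activity records. A minimal parsing sketch, assuming a hypothetical activities.csv with invented column names (the real PARS template defines its own required fields):

```python
# Sketch of reading a hypothetical comma-delimited activity export prior to
# upload. Column names and file name are invented for illustration; the
# actual PARS upload template defines its own fields and formats.
import csv

def load_activities(path):
    """Return a list of activity records parsed from a comma-delimited file."""
    with open(path, newline="", encoding="utf-8") as handle:
        return [
            {
                "activity_id": row["activity_id"],
                "title": row["title"],
                "physician_participants": int(row["physician_participants"]),
                "other_participants": int(row["other_participants"]),
            }
            for row in csv.DictReader(handle)
        ]

if __name__ == "__main__":
    for record in load_activities("activities.csv"):
        total = record["physician_participants"] + record["other_participants"]
        print(f"{record['activity_id']}: {record['title']} ({total} learners)")
```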

Formative Evaluation

• Provides immediate feedback to program participants

• Fosters reflection and self-regulation

• Provides feedback about the program itself to designers and sponsors

• Oriented to improving the focus of the program

• Based upon a dynamic model

Relationship Between Formative and Summative Evaluation Across Life of an Innovation

[Chart: relative emphasis (y-axis) on formative versus summative evaluation across program life (x-axis); Worthen 1987]

Focus Group

• MEMS implementers using XML in association with PARS

• 4 organizations represented

• Measured
  – Success and satisfaction
  – Resources for implementation
  – Characteristics of implementers
  – Decision factors


Results

• Implementers felt satisfied, but limited (not many customers implementing)

• Limited effort and resources required; change management important

• Implementers all fluent in XML

• ACCME adoption key; implementers seeking to better support customers, show competitive advantage, make links between different types of data

Case Study

• Implemented MEMS for PARS upload

• Experience implementing MedBiquitous standards

• Integrated new data into user interface and workflow

• Provided user training

Lessons Learned


• Standards implementation can bring about efficiencies when integrated into workflow and properly supported

• Validation tools needed (a minimal local validation sketch follows this list)

• Web services needed to realize full benefits, such as one-click data transfer
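The need for validation tools can be partly addressed locally by checking an export against the schema before it is uploaded. A minimal sketch, assuming the MEMS XSD has been saved as mems.xsd and the third-party lxml package is installed; both file names are illustrative:

```python
# Minimal local validation sketch: check a generated report against a saved
# copy of the schema before upload. Assumes lxml is installed and the MEMS
# XSD has been downloaded as mems.xsd; both file names are illustrative.
from lxml import etree

def validate(xml_path, xsd_path="mems.xsd"):
    """Return (is_valid, error_messages) for the document at xml_path."""
    schema = etree.XMLSchema(etree.parse(xsd_path))
    document = etree.parse(xml_path)
    is_valid = schema.validate(document)
    return is_valid, [str(entry) for entry in schema.error_log]

if __name__ == "__main__":
    ok, errors = validate("activity_report.xml")
    print("valid" if ok else "\n".join(errors))
```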

Contact Us


[email protected]

• Valerie will forward questions to the working group