This article was downloaded by: [Central U Library of Bucharest] on 12 June 2014, at 03:26.
Publisher: Routledge. Informa Ltd, registered in England and Wales, registered number 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Gerontology & Geriatrics Education
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/wgge20

Course Evaluation and Assessment: Examples of a Learner-Centered Approach

Eva M. Schmitt, PhD (a, b), Anne C. Hu, MPH (c), and Peter S. Bachrach, PhD (c)
a Division of Geriatrics, University of California, San Francisco, CA
b Institute on Aging Research Center, San Francisco, CA
c Division of Geriatrics, David Geffen School of Medicine at UCLA, Los Angeles, CA
Published online: 11 Oct 2008.

To cite this article: Eva M. Schmitt, Anne C. Hu, & Peter S. Bachrach (2008). Course Evaluation and Assessment: Examples of a Learner-Centered Approach. Gerontology & Geriatrics Education, 29(3), 290-300. DOI: 10.1080/02701960802359524

To link to this article: http://dx.doi.org/10.1080/02701960802359524

Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions


Gerontology & Geriatrics Education, Vol. 29(3) 2008
Available online at http://www.haworthpress.com
© 2008 by The Haworth Press. All rights reserved.
doi:10.1080/02701960802359524

Course Evaluation and Assessment: Examples of a Learner-Centered Approach

Eva M. Schmitt, PhD
Anne C. Hu, MPH
Peter S. Bachrach, PhD

ABSTRACT. Teaching in higher education increasingly requires greater accountability, the utilization of contemporary learner-focused teaching models, and transparent grading methods for nonstandardized learning products. This article describes learner-centered evaluation and assessment strategies and illustrates how these approaches emphasize learners' responsibility for their own learning, foster students' commitment to learning, and provide useful information for continuous curriculum improvement. In addition, the article discusses the components of learner-centered assessment models, including the course assessment and enhancement model, the Personal Action Plan, and the Gedanken Experiment. Further, a rubric is presented as a tool for systematic and transparent grading of learner-centered assessment products. The need for further validation of these strategies is discussed.

KEYWORDS. learner-centered, evaluation, assessment, rubric

Eva M. Schmitt, PhD, Program Evaluator, Division of Geriatrics, University of California, San Francisco, San Francisco, CA; and Associate Director for Program Evaluations, Institute on Aging Research Center, San Francisco, CA.

Anne C. Hu, MPH, Program Coordinator, Division of Geriatrics, David Geffen School of Medicine at UCLA, Los Angeles, CA.

Peter S. Bachrach, PhD, Program Evaluator, Division of Geriatrics, David Geffen School of Medicine at UCLA, Los Angeles, CA.

Address correspondence to: Eva M. Schmitt, PhD, Associate Director–Program Evaluations, Institute on Aging Research Center, 3330 Geary Blvd., San Francisco, CA 94118–334 (E-mail: [email protected]).



INTRODUCTION

Evaluation and learner-centered strategies are currently major themes in the discourse around higher education. While the growing focus on accountability and justification places evaluation activities in an increasingly important role, the paradigm shift away from traditional content- and teacher-centered models emphasizes learner-centered teaching and assessment approaches (Huba & Freed, 2000). However, learner-centered evaluation activities, which provide learners the opportunity to formatively shape the conduct of an ongoing educational program, have received little attention (National Commission on Accountability in Higher Education, 2005). While evaluation concentrates on the appropriateness and quality of a program or curriculum, assessments focus on individual learners' achievements. In the educational setting, however, evaluation and assessment are often used interchangeably (Suskie, 2004). The purpose of this article is to present and discuss the different components of learner-centered assessments and learner-centered evaluation strategies. To this end, we describe principles, provide examples, and discuss components of learner-centered education evaluation and assessment strategies.

PRINCIPLES OF LEARNER-CENTERED EVALUATION AND ASSESSMENT AND PREVIOUS APPLICATIONS

Program evaluation is a systematic process of utilizing data to judge a given program (Kirkpatrick & Kirkpatrick, 2006) and may include formative and/or summative strategies. Summative evaluation activities are conducted at the end of a course to judge a program retrospectively. In contrast, formative evaluation assesses a program while it takes place, providing data for timely intervention if needed (Suskie, 2004). One popular model for program evaluation identifies four levels of evaluation activities: (1) reaction: satisfaction with and perception of a program's quality; (2) learning: changes in attitudes, knowledge, and skills due to program attendance; (3) behavior: changes in practice and the application of learning to practice that are attributable to the program; and (4) results: assessment of changes in a system or organization related to program participation (Kirkpatrick & Kirkpatrick, 2006). However, this model focuses on summative evaluation and provides little guidance for formative strategies (Bates, 2004). Summative evaluation is often conducted with satisfaction surveys to determine improvement needs identified by



learners (Suskie, 2004). However, the value of student feedback at the end of a course is unclear, with some claiming that ritualized evaluation activities may be more effective in supporting teacher promotion than in informing course improvement. In addition, student feedback solicited only at the end of a course deprives students of the opportunity to benefit from course improvements based on their own feedback (Combs, Gibson, Hays, Saly, & Wendt, 2007; Hendry, Cumming, Lyon, & Gordon, 2001). The inclusion of formative student feedback given during the course of a program to guide ongoing course modifications to current students' needs has received little attention in the literature. One exception is the course assessment and enhancement model, which utilizes student feedback to tailor a course to current students' needs and to improve future courses (Combs et al., 2007). At the beginning of the course, students are asked to rate their perceived competence in, and the importance of, course-specific learning objectives. Guided by students' answers, teachers communicate the importance of objectives with low importance ratings and emphasize objectives with low perceived competence. To analyze changes in students' perceptions due to completion of the course, students are asked at the end of the course to rate the importance of course objectives and to assess their perceived ability to complete these objectives. In concert with other strategies, pre- and postassessment inform course evaluation and any needed modifications of the curriculum, course objectives, and delivery methods. This course assessment and enhancement model utilizes student feedback and self-assessed skills data to customize teaching for current learners and to guide improvement strategies for future courses (Combs et al., 2007).
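The pre/post comparison at the heart of the course assessment and enhancement model can be sketched as a simple paired summary. The rating scale, objective names, and the "low competence" threshold below are illustrative assumptions for exposition, not details of the model as published by Combs et al.:

```python
# Illustrative sketch of the pre/post competence analysis in a course
# assessment and enhancement model. The 1-5 rating scale, objective
# names, and the "low" threshold are assumptions, not published values.
from statistics import mean

def summarize(ratings_pre, ratings_post, low_threshold=3.0):
    """Return per-objective mean pre/post competence and the change."""
    summary = {}
    for objective, pre in ratings_pre.items():
        post = ratings_post[objective]
        summary[objective] = {
            "pre_mean": mean(pre),
            "post_mean": mean(post),
            "change": mean(post) - mean(pre),
            # Objectives rated low at baseline get extra emphasis in class.
            "emphasize": mean(pre) < low_threshold,
        }
    return summary

# Hypothetical ratings from three students on two objectives.
pre = {"falls assessment": [2, 3, 2], "dementia care": [4, 4, 5]}
post = {"falls assessment": [4, 4, 3], "dementia care": [5, 4, 5]}

for obj, s in summarize(pre, post).items():
    print(obj, round(s["change"], 2), "emphasize" if s["emphasize"] else "")
```

The baseline pass tells the teacher which objectives to emphasize for the current cohort; the pre/post change feeds the summative review of the course.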
Expanding upon the course assessment and enhancement model and the assessment model described by Suskie (2004), learner-centered evaluation contributes to a continuous cycle, including learner feedback and/or learner-centered assessment results collected before, during, and after a program to inform ongoing and subsequent course goals, objectives, and methods (Figure 1). This conceptualization of learner-centered evaluation adds learner-provided information to other program evaluation strategies such as the results of traditional tests or the review of contextual factors (e.g., cost).

Traditionally, learning is assessed with standardized testing of knowledge absorption and fact recall to determine learners' achievement of predefined learning goals. With the recognition that standardized tests may have value for comparing student learning across sites without providing insight into how well students transfer what they have learned into practice, learner-centered assessment strategies, focusing on students' knowledge



transfer and practice integration, have become increasingly popular (Epstein & Hundert, 2002). Although no standard definition exists, learner-centered assessments are often described as providing guidance to address learning needs and to evaluate a teaching program (Epstein & Hundert, 2002; Huba & Freed, 2000; Suskie, 2004; Weimer, 2002). Analogous to the learner-centered psychological principles issued by the American Psychological Association [APA] Board of Educational Affairs (1997), learner-centered assessments (1) give learners the opportunity to pursue personally relevant learning goals and to integrate new knowledge with prior knowledge; (2) make learners responsible for their own learning; (3) encourage learners to apply knowledge in novel situations, reflect on their learning, and set reasonable learning goals; (4) provide possibilities for collaboration and social interaction among learners; (5) respect individual learning styles and preferences; and (6) are appropriate to learners' skill level and cultural background (APA Board of Educational Affairs, 1997). Examples of learner-centered assessment strategies include reflections, portfolios, Personal Action Plans, and Gedanken Experiments. Reflections target metacognitive skills by encouraging learners to think about their own learning content and processes (what, how, and why). Portfolios assemble diverse,

FIGURE 1. Learner-centered evaluation of educational programs as a continuous cycle. [Diagram components: A. Course Evaluation; B. Course Goals/Objectives/Methods; C. Learner; D. Assessment; E. Feedback; F. Traditional Test; G. Learner-Centered Assessment; H. Learner-Centered Evaluation; other factors such as cost, attendance, and faculty availability.]



learner-chosen evidence of learning and provide a holistic picture of learners' skills and progress (Suskie, 2004). Personal Action Plans and Gedanken Experiments are tools to facilitate reviews of learners' personal agendas and to provide support on how to achieve these goals. We describe examples of a Personal Action Plan and a Gedanken Experiment below.

While standardized tests are seen as an objective way to rate learning, questions of consistency, reliability, and objectivity may arise if learner-centered assessments are graded. Rubrics, a type of scoring guide, are increasingly used as a systematic and more transparent approach to grading learner-centered assessment products. By describing characteristics that correspond to a judgment of performance, rubrics are, when carefully constructed, reliable and valid measurements for somewhat subjective assessments (National Academies, 2007). Rubrics not only ensure consistency in grading student assignments but also help learners better understand teachers' expectations for a given assignment. Using predetermined standardized criteria for a defined program task (e.g., a reflective piece, a portfolio), a rubric can be a useful tool before, during, and after a learning opportunity or program implementation. For example, prior to learning, rubrics identify expectations for a given assignment, while during and after a program they define criteria used as part of the student assessment and provide formative and summative course evaluation data (Huba & Freed, 2000; National Academies, 2007; Suskie, 2004). Below we describe the evaluation of a geriatric medicine leadership training program that incorporates a rubric to assess participants' actual behavior change. Further information on learner-centered assessment and grading strategies can be found, for example, in Epstein and Hundert (2002), Huba and Freed (2000), Suskie (2004), and the Accreditation Council for Graduate Medical Education (2000).

EXAMPLES OF TWO LEARNER-CENTERED ASSESSMENT APPROACHES

Personal Action Plan

Personal Action Plans are widely used in chronic care management to encourage patients to set their own health goals as well as to take an active role in their care (Barlow, Wright, & Sheasby, 2002). Similarly, in educational settings, a Personal Action Plan encourages learners' active participation and guides the transfer of knowledge gains to a practical setting.



The planning and curriculum committee for the Donald W. Reynolds Faculty Development to Advance Geriatric Education (FD-AGE) Mini-Fellowship Program at UCLA utilized Personal Action Plans to support and assess learning in participants (Table 1). Because the program is designed to improve participants' teaching skills, participants are required to develop a teaching product using one of the teaching techniques taught during the course. Personal Action Plans review each participant's personal agenda for geriatrics teaching, define participants' teaching goals, provide insight into practical strategies for achieving teaching goals, and determine criteria for meeting teaching goals. Participants are encouraged to formulate specific, measurable, achievable, realistic, and time-specific (SMART) objectives (Drucker, 1954).

At the beginning of the Mini-Fellowship, participants are reminded to use Personal Action Plan components as they develop their teaching product. On the last day, participants discuss their Personal Action Plans with a coach and as part of a group, and provide overall feedback on the course. Participants also complete a Personal Action Plan form (the participants and the planning/curriculum committee each receive a copy), which serves as a baseline for assessing progress toward the stated goals 3 months after the end of the program.

TABLE 1. Personal Action Plan guide

1. Think about how you can apply what you have learned over the past 2 days to the way you teach geriatric content, and develop a plan to make one change to improve geriatrics training at your institution.
   a. First, identify what the product of your change will be. You should be able to describe this product in 3–5 words (e.g., a new geriatric lecture series, a structured geriatrics clinical experience). Think of this product from the 30,000-foot level.
   b. Next, describe in more detail what you would like this product to look like (e.g., 10 hours of lectures on topics including falls, incontinence, and dementia; 4 half-days at a nursing home). This is looking at the product at the 50-foot level. This doesn't need to be perfect, and your coach will help you refine your thinking.
2. Once you have defined a plan, think about what steps will be needed to implement your change.
   a. Outline the specific objectives for your plan. Keep in mind these objectives should be SMART (specific, measurable, achievable, realistic, and time-limited).
   b. Identify what resources you will need to accomplish your plan. Remember, resources can include personnel, financial requirements, leadership, support from the curriculum committee or residency director, etc.
   c. Identify potential obstacles or barriers to implementing your plan.
3. What are some strategies for overcoming the obstacles outlined above?
4. In what ways will this change make your training program better?



Participant feedback during Personal Action Plan discussions is used for continuous program improvement. For example, the Personal Action Plan was originally introduced on the third day of the program, but learners indicated that they needed more time to develop a plan. Consequently, Personal Action Plan components are now introduced on the first day of the program. Additionally, participants were taught how to develop their own standardized patient (SP) curriculum, but participants rejected the SP approach as too costly and time intensive. In reaction to this feedback, the SP curriculum was replaced with a problem-based learning workshop, a cost-effective instructional method that challenges participants to learn from each other while working cooperatively in groups to seek solutions to real-world problems (Barrows & Tamblyn, 1980).

Gedanken Experiment

By using a Gedanken Experiment (i.e., a thought experiment) as part of its evaluation, the UCLA Geriatric Medicine Leadership Training program (GMLT) compelled participants to think about management and leadership issues prior to the actual coursework, and to provide an online assessment of knowledge gain and behavioral change beyond mere self-report. Gedanken Experiments were used by Albert Einstein as well as psychologists of the Wundtian school to reflect upon scientific processes and methods of problem solving to propel further scientific inquiry (Gardner, 1997; Miller, 1999). Analogously, this technique was used at UCLA to require GMLT trainees to reflect upon course content prior to attending the course. Trainees received a precourse assignment to hypothesize about a challenge they had faced at their workplace and to work through the steps required to remediate it, thus creating stronger mental hooks upon which to hang the in-person coursework.

Prior to attending GMLT, trainees completed a brief narrative description of what they thought would occur given a particular challenge at their worksite or clinic (Time 1). Specifically, they were asked to (1) identify a management or leadership problem that they thought could benefit from the application of geriatric medicine leadership and management tools, skills, and knowledge; (2) identify the goals and steps needed to remediate the problem; (3) critically analyze and describe what would occur if they actually implemented this plan; and (4) describe what the lessons learned might be.

During the course, trainees revised their original narratives by incorporating new learning and skills obtained from GMLT (Time 2). Three



months after GMLT, trainees were asked to implement a remediation plan, and to critically analyze and describe their outcomes in a brief narrative report (Time 3). To profile behavioral changes across the three assessment points, the UCLA research team devised a scoring rubric based on detailed identification of the core elements provided by the training (i.e., the learning objectives of the course) and on a demonstration of sophisticated critical thinking regarding management and leadership issues in a geriatric medicine context (Table 2). Key components were assigned point values for inclusion, degree of detail, and level of sophistication expressed (i.e., did trainees superficially touch upon a core element, or did they demonstrate a deeper, advanced understanding?).

The rubric-based scoring procedure revealed that GMLT trainees demonstrated improved knowledge regarding management and leadership issues surrounding geriatric medicine, as well as an increased level of sophistication surrounding these issues, by Time 3. This finding suggests that processing the challenges of geriatric medicine management and leadership issues prior to receiving intensive training in how to approach and remediate such challenges may result in demonstrable gains in knowledge. In addition, this assessment method (in concert with traditional evaluation activities) afforded several modifications to the GMLT program. For example, modifications included multiple reminders to busy participants,

TABLE 2. Sample rubric scoring template

Scoring columns:
- Mere Mention (score: 1 pt.): some mention of the element.
- Content Details (score: 1 pt. per relevant detail): management, leadership, health law, finances, mentoring, bioethics, organizational structure.
- Critical Thinking (score: 0–3): 0 = no evidence of sophisticated thinking; 1 = some evidence of sophisticated thinking; 2 = mostly complete, but not fully sophisticated; 3 = complete, shows a high level of sophisticated thinking.

Elements scored:
1. Identify geriatric management/leadership problem
2. Identify end point (problem solution)
3. Develop action plan (steps necessary to move from problem to solution)
4. Implementation of action plan
5. Description of failure to implement plan (if necessary)
6. Lessons learned
7. Mentor relationship
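The point scheme in Table 2 reduces to simple arithmetic per element. The sketch below illustrates that arithmetic; the data layout, element names, and example ratings are hypothetical and not drawn from the UCLA team's actual instrument:

```python
# Illustrative sketch of the Table 2 rubric arithmetic: 1 point for
# mentioning an element, 1 point per relevant content detail, plus a
# 0-3 critical thinking score. Data layout and names are assumptions.

def score_element(mentioned, details, critical_thinking):
    """Score one rubric element (e.g., 'develop action plan')."""
    if critical_thinking not in (0, 1, 2, 3):
        raise ValueError("critical thinking is scored 0-3")
    points = 1 if mentioned else 0          # mere mention: 1 pt
    points += len(details)                  # 1 pt per relevant detail
    points += critical_thinking             # 0-3 sophistication scale
    return points

def score_narrative(element_ratings):
    """Total a narrative across all rated rubric elements."""
    return sum(score_element(*r) for r in element_ratings.values())

# Hypothetical ratings of one trainee's narrative at two time points.
time1 = {
    "identify problem": (True, ["management", "finances"], 1),
    "develop action plan": (True, ["leadership"], 0),
}
time3 = {
    "identify problem": (True, ["management", "finances", "mentoring"], 2),
    "develop action plan": (True, ["leadership", "organizational structure"], 3),
}

print(score_narrative(time1), "->", score_narrative(time3))  # 6 -> 12
```

Comparing totals across Time 1, Time 2, and Time 3 mirrors how the rubric profiles knowledge gain and growing sophistication over the three assessment points.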



a program-commencing group poster session, wherein participants present their revised Time 2 narrative as a brief poster to obtain group feedback, and a program-concluding "booster" session, in which GMLT faculty use a small-group format to help trainees modify their Time 2 exercises and set more realistic expectations for their Time 3 implementation.

COMPONENTS OF LEARNER-CENTERED ASSESSMENT AND EVALUATION AS APPLIED IN THE EXAMPLES

With behavior change as the ultimate goal, the Donald W. Reynolds FD-AGE and UCLA GMLT programs assess learners based on self-identified learning goals and encourage the construction of knowledge by linking these learning goals to new, already existing, and future-oriented knowledge. Both examples also include the application of what was learned into practice and encourage reflection on the successes and barriers of that application. The FD-AGE Mini-Fellowship Program compels participants to apply new knowledge and change geriatrics teaching at their institution by developing and implementing a new teaching product. In addition, learners are encouraged to reflect on how the implementation of their product will improve teaching and how to overcome anticipated barriers. Conversely, GMLT expects learners to reflect on a self-chosen work-related problem prior to the course and to consider how GMLT could support the solution to the problem. After the course, learners apply what they have learned to solve the previously identified problem and reflect on the success of the remediation. In both examples, collaborative learning is encouraged via group discussions of the learning product and process. Furthermore, in both examples, instructors function as mediators who utilize assessment tools to coach and help learners improve, yet allow learners to take personal responsibility for their own learning and assessment.

Granted, both examples illustrate the integration of learner-centered assessment components and the utilization of assessment results and learner feedback to improve subsequent courses. But they do not afford mid-course corrections based on current learners' feedback.

DISCUSSION

We described learner-centered evaluation and assessment strategies that give learners responsibility for their own learning to increase



students' commitment to learning while providing useful information for continuous curriculum improvement. Although learner-centered assessment strategies have strong face validity, no standard definitions of these concepts exist, and little rigorous research has been conducted to document the advantage of these approaches over traditional strategies (Ludmerer, 2004; Suskie, 2004). For example, the positive impact of individualized written action plans on self-management skills in patients with chronic conditions has been demonstrated, but little data is available on the impact of Personal Action Plans in the educational setting (Gibson et al., 2002). Moreover, the lack of psychometric data and the time and resource requirements of these approaches further point to the need for more research to assess whether learner-centered evaluation and assessment strategies result in significantly better learning, practical application, and curriculum development than traditional methods.

REFERENCES

Accreditation Council for Graduate Medical Education. (2000). Toolbox of assessment methods. Retrieved December 7, 2007, from http://www.acgme.org/outcome

American Psychological Association [APA] Board of Educational Affairs. (1997). Learner-centered psychological principles: A framework for school reform and redesign. Retrieved January 12, 2008, from http://www.apa.org/ed/cpse/LCPP.pdf

Barlow, J., Wright, C., & Sheasby, J. (2002). Self-management approaches for people with chronic conditions: A review. Patient Education and Counseling, 48, 177–187.

Barrows, H. S., & Tamblyn, R. M. (1980). Problem-based learning: An approach to medical education. New York: Springer.

Bates, R. (2004). A critical analysis of evaluation practice: The Kirkpatrick model and the principle of beneficence. Evaluation and Program Planning, 27, 341–347.

Combs, K. L., Gibson, S. K., Hays, J. M., Saly, J., & Wendt, J. T. (2007). Enhancing curriculum and delivery: Linking assessment to learning objectives. Assessment & Evaluation in Higher Education, 33(1), 87–102.

Drucker, P. F. (1954). The practice of management (1st ed.). New York: Harper & Row.

Epstein, R. M., & Hundert, E. M. (2002). Defining and assessing professional competence. Journal of the American Medical Association, 287, 226–235.

Gardner, H. (1997). Extraordinary minds: Portraits of exceptional individuals and an examination of our extraordinariness. London: Weidenfeld & Nicolson.

Gibson, P. G., Powell, H., Coughlan, J., Wilson, A. J., Abramson, M., Haywood, P., et al. (2002). Self-management education and regular practitioner review for adults with asthma. Cochrane Database of Systematic Reviews, 2002(2), Art. No. CD001117. DOI: 10.1002/14651858.


Hendry, G. D., Cumming, R. G., Lyon, P. M., & Gordon, J. (2001). Student-centred course evaluation in a four-year problem-based medical programme: Issues in collection and management of feedback. Assessment & Evaluation in Higher Education, 26, 327–339.

Huba, M. E., & Freed, J. E. (2000). Learner-centered assessment on college campuses: Shifting the focus from teaching to learning. Needham Heights, MA: Allyn & Bacon.

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco: Berrett-Koehler.

Ludmerer, K. M. (2004). Learner-centered medical education. New England Journal of Medicine, 351, 1163–1164.

Miller, A. I. (1999). Einstein's first steps toward general relativity: Gedanken experiments and axiomatics. Physics in Perspective, 1(1), 85–104.

National Academies. (2007). Assessments in science education. Retrieved December 1, 2007, from http://www.nap.edu/readingroom/books/nses/html/5.html

National Commission on Accountability in Higher Education. (2005). Accountability for better results: A national imperative for higher education. Retrieved December 1, 2007, from http://www.sheeo.org/account/accountability.pdf

Suskie, L. (2004). Assessing student learning: A common sense guide. Bolton, MA: Anker Publishing.

Weimer, M. (2002). Learner-centered teaching: Five key changes to practice. San Francisco: Jossey-Bass.
