Evaluation in Education


Evaluation: What is it? (Definitions). Why is it needed? (Purposes). How is it done? (Levels, Criteria, Types, Techniques, Methods)


  • 1. EVALUATION IN EDUCATION Dr. Kusum Gaur, Assoc. Prof., PSM; WHO Fellow for IEC; 09/07/10

2. Is that web site good enough to surf?

3. This one looks good, but how can I be sure?

4. Think of CARRDSS

  • CREDIBILITY / AUTHORITY
  • ACCURACY
  • RELIABILITY
  • RELEVANCE
  • DATE
  • SOURCES BEHIND THE TEXT
  • SCOPE AND PURPOSE

5. There are other questions in life!

    • Which car should I buy?
    • Should I take this medication?
    • Should my child have this surgery?

6. Just as we evaluate our website to surf . . .

  • We have to take decisions on the basis of some criteria

7. Evaluation is important! To draw conclusions and make new predictions.

8. Evaluation

  • What is evaluation?
  • Why evaluate?
  • How is it done?

9. What is evaluation?

    • Evaluation is the systematic determination of merit, worth and significance of something or someone, using criteria against a set of standards. (en.wikipedia.org/wiki/evaluation)
    • Evaluation is to draw; to assess; to compute an expression. (en.wiktionary.org/wiki/evaluation)
    • Evaluation is the process of making judgments based on criteria and evidence. (www.sbctc.edu/College/e-assessglossary)

10. What is evaluation?

  • Evaluation is the process of examining a subject and rating it based on its important features, i.e. criteria

11. Why Evaluate?

  • Accountability
  • Validating our hypothesis
  • Comparison
  • Knowing Status
  • Knowing Needs
  • Planning further
  • PURPOSE ...........

12. Why Evaluate?

  • Measures the effectiveness of the instructor
  • Measures effectiveness in meeting objectives
  • Provides feedback to students
  • Provides students with gratification and motivation ...........

13. To be known before "How to EVALUATE?": quality evaluation

14. Quality Evaluation: Criteria, Questions, Techniques, Methods. Quality: how the evaluation will be done.

15. Criteria

  • A certain standard against which the achievements of a learner are measured

16. Good Quality Criteria

  • Validity
  • Reliability
  • Reproducibility
  • Sensitivity
  • Specificity

(VRRSS)

17. Types of Evaluation

  • As per Quality of Evaluation
    • Quantitative
    • Qualitative
  • As per Time of Evaluation
    • Formative
    • Summative
    • Pre-Post Evaluation

18. Quality of Evaluation

  • Quantitative Evaluation
    • Provides a quantifiable, objective measure
    • Expressed in proportions
    • Example:
      • How many students have got >60%?
  • Qualitative Evaluation
    • Communicates general expectations
    • Expressed in grading
    • Open to interpretation
    • Example:
      • What about his socio-economic status?
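The quantitative example above ("How many students have got >60%?") amounts to computing a proportion. A minimal sketch; the marks list and the 60% cutoff are illustrative assumptions, not data from the slides:

```python
# Proportion of students scoring above 60% (hypothetical class marks).
marks = [45, 72, 58, 81, 66, 90, 55, 63]

above_cutoff = [m for m in marks if m > 60]
proportion = len(above_cutoff) / len(marks)

print(f"{len(above_cutoff)} of {len(marks)} students "
      f"({proportion:.0%}) scored above 60%")
```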

19. Time of Evaluation

  • Formative Evaluation
    • Ongoing evaluation during an instructional period
    • To know the perceptions of the students in comparison to the instructor's
  • Summative Evaluation
    • Conducted at the end of the course
    • Purpose is to form a judgment about
        • Performance of the student
        • Effectiveness of the instructor
        • Effectiveness of the course
    • Regularly scheduled at the end of academic terms
  • Pre- and Post-Evaluation

20. Formative v/s Summative Evaluation

| Quality | Formative | Summative |
| --- | --- | --- |
| Purpose | Detect strengths & weaknesses | Overall achievements |
| Frequency | During or at the end of a unit | At end point: certification, promotion |
| Area covered | One unit / a number of units | Course content |
| Administrative utility | Advisory, not always for permanent record | Decisive, for permanent record |
| Feedback to students | Given immediately | Informs regarding pass or fail |
| Feedback to faculty | If a significant number show errors, weakness is in instruction | Overall pass or fail |

21. Pre- and Post-Evaluation

    • Evaluate at the beginning to assess needs
    • Evaluate at the end to assess outcome
    • To assess the degree of achievement of objectives through pre-post evaluation
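Pre- and post-evaluation compares each learner's scores before and after instruction. A minimal sketch of computing individual gains and the mean gain, using made-up paired scores:

```python
# Hypothetical pre- and post-test scores for the same five learners,
# in the same order, so the lists can be paired element-wise.
pre = [40, 55, 35, 60, 50]
post = [65, 70, 50, 80, 72]

gains = [after - before for before, after in zip(pre, post)]
mean_gain = sum(gains) / len(gains)

print("individual gains:", gains)
print("mean gain:", mean_gain)
```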

22. Questions: evaluation questions should include each level of evaluation

23. Levels of Evaluation

    • Level I Reaction
        • How did the student react to the class?
    • Level II Learning
        • What has the student learned?
    • Level III Skill
        • How much did the student retain?
    • Level IV Impact
        • What is the final impact or practical application of this learning?

24. Techniques of Evaluation

    • Teaching dossiers
    • Student ratings
    • Peer observations
    • Interviews
    • Portfolios
    • Classroom Assessment
  • Evaluation should use a combination of strategies to take advantage of their inherent strengths as well as their individual limitations.

25. Techniques of Evaluation

  • Teaching Dossiers
    • Usually done at the end of a unit, in the form of a written document
    • Submitted as an assessment of students
  • Student Ratings
    • Rating of teaching by students
  • Peer Observations
    • Colleagues judge teaching quality by observation
    • Usually useful for formative evaluation
  • Interviews
    • Used in teaching award nominations
  • Portfolios
    • Appropriateness of course goals and objectives
  • Classroom Assessment
    • Effectiveness of teaching on learning
    • Can be used in a timely way to help instructors identify gaps

26. Methodologies

    • Project Assignments
    • Observation
        • Simple Observation
        • Think-aloud (let the examinee explain)
        • Constructive Interaction
    • Query via
        • Interviews
        • Focus Group Discussion (FGD)
        • Questionnaires
  • Combinations of methodologies should be used

27. Summary of Methodologies

| Type | Strengths | Weaknesses |
| --- | --- | --- |
| Project Assignment | Creativity; skill judgment | Time-consuming; analysis is difficult |
| Observational | Skill judgment; in depth; less planning required | Time-consuming; analysis is difficult |
| Interview | Depth; high response; communication skill | Extensive planning; time-consuming; analysis difficult |
| Focus Group Discussion (FGD) | Group energy; communication skill; interpersonal relationship | Extensive planning; analysis difficult |
| Questionnaire | Extensive in use; in depth; easy analysis | Extensive planning |

28. Styles of Questions

    • Open-ended questions
    • Closed
    • Scalar
    • Multi-choice
    • Ranked
    • Combining open-ended and closed questions

29. Open-ended questions

    • Asks for unprompted opinions
    • Good for general subjective information
    • But difficult to analyse
    • e.g. Can you suggest any improvements to the present health care delivery system?

30. Closed questions

    • Restrict the respondents' responses by supplying alternative answers
    • Can be easily analyzed
    • But watch out for hard-to-interpret responses!
    • Alternative answers should be very specific

31. Scalar

    • Asks the user to judge a specific statement on a numeric scale
    • Scale usually corresponds with agreement or disagreement with a statement
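Scalar (Likert-type) responses can be summarised numerically. A minimal sketch, assuming a 1 (strongly disagree) to 5 (strongly agree) scale and made-up responses:

```python
# Hypothetical responses on a 1-5 agreement scale.
responses = [4, 5, 3, 4, 2, 5, 4]

mean_score = sum(responses) / len(responses)
# Share of respondents who agree (rated 4 or 5).
agreement = sum(1 for r in responses if r >= 4) / len(responses)

print(f"mean rating: {mean_score:.2f}")
print(f"share agreeing (4 or 5): {agreement:.0%}")
```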

32. Multi-choice

    • Respondent offered a choice of explicit responses
    • e.g. Which type of presentation is this?
        • Microsoft Word
        • Microsoft PowerPoint
        • Microsoft Excel
        • Microsoft Outlook

33. Ranked

    • Respondent places an ordering on items in a list
    • Useful to indicate a user's preferences
    • e.g. Rank the usefulness of 108 services in the health sector.
        • Useful
        • Do not know
        • Of no use

34. Combining open-ended and closed questions

    • Gets a specific response, but allows room for the user's opinion
    • e.g. Compare and comment on IMR in developing versus developed countries

35. Tests

    • Written
          • Questionnaire
    • Practical
          • Project Assignment, Case Study, Observation, Interview, FGD
    • Oral
          • Interview, FGD

36. Written Tests

        • Multiple Choice
        • True/False
        • Matching
        • Completion or fill in the blank
        • Essay
          • Short Notes
          • Long Essay

37. Multiple Choice Tests

    • Common method
    • Easy to grade and objective
    • Test construction miscues:
        • Using information from previous questions
        • Negatively worded stems
        • Fill-in-the-blank in the middle of the stem
        • Using "all of the above" or "none of the above"
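Since multiple-choice tests are easy to grade objectively, scoring can be automated against a key. A minimal sketch; the answer key and student responses are hypothetical:

```python
# Score a multiple-choice test against an answer key (hypothetical data).
answer_key = {1: "b", 2: "d", 3: "a", 4: "c"}
student_answers = {1: "b", 2: "c", 3: "a", 4: "c"}

# Count questions where the student's answer matches the key;
# unanswered questions simply score zero.
correct = sum(1 for q, key in answer_key.items()
              if student_answers.get(q) == key)
score_pct = 100 * correct / len(answer_key)

print(f"score: {correct}/{len(answer_key)} ({score_pct:.0f}%)")
```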

38. True/False Tests

    • Limited to two answers, no gray area
    • Difficult to construct in positive voice
    • Avoid always or never statements
    • Useful tool as a study guide

39. Matching Tests

    • Works best with definitions and terms
    • Difficult to design
    • Cautious of multiple matches
    • Test directions must be clear

40. Completion Tests

    • Fill in the blank
    • Statements must be clear as to intent of question
    • Need to be grammatically correct
    • Be aware of size of blank
    • Avoid having blank at beginning of sentence

41. Essay Tests

    • May require long or short answer
    • Time consuming and difficult to grade
    • Recommended to grade in a group format
    • Hand written exams must allow sufficient time
    • Shotgun approach: as much information as possible, in hopes of hitting the target

42. Oral Exams

    • Requires verbal answers by students
    • Advantages
        • Evaluate quick reaction of student
        • Assesses the student's thought process
    • Disadvantages
        • Limited number of students examined at one time
        • Difficult to standardize
        • Time consuming and labor intensive
        • Unexpected distractions
        • Unfair emphasis on repeated mistakes

43. Project Assignments

    • Gets students working outside the class
    • In groups, helps develop people skills
    • Negatives
      • Hard to standardize
      • Potential plagiarism
      • May measure only end product and not consider the process

44. Practical Exams

    • Demonstration of a skill in the context of a scenario
    • Demonstration of steps of performing a skill

45. How to Evaluate: the Process of Evaluation

46. Process of Evaluation (9 Steps): Planning, Implementation, Feedback

    • Step 1. Define Purpose & Scope
    • Step 2. Specify Evaluation Questions
    • Step 3. Specify Evaluation Design
    • Step 4. Create Data Collection Action Plan
    • Step 5. Data Collection
    • Step 6. Data Analysis
    • Step 7. Draw Inferences from Findings
    • Step 8. Disseminate Information
    • Step 9. Feedback for Improvement

47. Data Collection Action Plan

    • Decide level of Evaluation
    • Type of Evaluation
    • Type of Evaluation Technique
    • What Evaluation Questions?
    • From whom / Data Sources?
    • By whom?
    • When Collected?
    • How Collected?
    • How are Data to be Analyzed?

48. Accreditation Council for Graduate Medical Education (ACGME)

  • Six General Competencies:
    • Patient Care
    • Medical Knowledge
    • Practice-Based Learning
    • Interpersonal and Communication Skills
    • Professionalism
    • Systems-Based Practice

49. Choosing an Evaluation Method

  • Why in process: purpose of evaluation
  • When in process: formative vs. summative
  • Style of evaluation: laboratory vs. field
  • Type of measures: qualitative vs. quantitative
  • Level of information: high level vs. low level
  • Resources available: time, subjects, tools & equipment, expertise, etc.

50. Example

  • Which type of test is most commonly used for state or national certifications?
        • True/False
        • Matching
        • Multiple choice
        • Fill in the blank

51. Goal of Evaluation: Status, Evaluation, Feedback

52. If you cannot measure it, you cannot improve it. So evaluation is important. Dr. Kusum Gaur, Assoc. Prof., PSM, WHO Fellow IEC

53. Thanks