Program Assessment: A Process From Start to Finish
RJ Ohgren – Office of Judicial Affairs
Mandalyn Swanson, M.S. – Center for Assessment and Research Studies
James Madison University
Session Outcomes
By the end of this session, attendees will be able to:
• Explain how assessment design informs program design
• Describe the “Learning Assessment Cycle”
• Express the difference between a goal, learning objective, and program objective
• Identify effective frameworks to design learning outcomes
• Define fidelity assessment and recognize its role in the Learning Assessment Cycle
Why Assess?
It’s simple:
• The assessment cycle keeps us accountable and intentional
• We want to determine if the benefits we anticipated occur
• Are changes in student performance due to our program?

If we don’t assess:
• Programming could be ineffective – we won’t know
• Our effective program could be terminated – we have no proof it’s working
Typical Assessment
• Well, we’ve got to do a program. Let’s put some activities together.
• Let’s ask them questions about what we hope they get out of it afterwards.
• Um…let’s ask if they liked the program too. And let’s track attendance.
• Survey says…well, they didn’t really learn what we’d hoped. But they liked it? And a good number of people came? Success!
Proper Assessment
What do we want students to know, think or do as a result of this program?
• Let’s define goals and objectives that get at what we want students to know, think or do.
• What specific, measurable things could show that we’re making progress towards these goals and objectives?
• What activities can we incorporate to get at those goals and objectives?
We have a program!
Learning Assessment Cycle
1. Establish Program Objectives
2. Create & Map Programming to Objectives
3. Select and Design Instrument
4. Implementation Fidelity
5. Collect Objective Information
6. Analyze & Maintain Information
7. Use Information
Goals, Objectives, & Items
[Diagram: a single Goal branches into several Objectives; each Objective is measured by multiple Items.]
Goals v. Objectives
• Goals can be seen as the broad, general expectations for the program
• Objectives can be seen as the means by which those goals are met
• Items measure our progress towards those objectives and goals
Goals vs. Objectives

Goal
• General expectation of student (or program) outcome
• Can be broad and vague
• Example: Students will understand and/or recognize JMU alcohol and drug policies.

Objective
• Statement of what students should be able to do or how they should change developmentally as a result of the program
• More specific; measurable
• Example: Upon completion of the BTN program, 80% of students will be able to identify 2 JMU Policies relating to alcohol.
Putting it All Together
[Diagram: a hierarchy – the program MISSION sits atop several GOALs; each Goal is supported by multiple Objectives; each Objective is tied to one or more Assessments.]
By The Numbers Program Goal
• Goal: To provide a positive classroom experience for students sanctioned to By the Numbers
• Objective: 80% of students will report that the class met or exceeded their expectations of the class.
• Item: Class Evaluation #15 – Overall, I feel like this class…
• Objective: 80% of students will agree (or better) with the statement “the facilitators presented the material in a non-judgemental way.”
• Item: Class Evaluation #5.5 – The facilitators presented the material in a non-judgemental way.
• Objective: 60% of students will report an engaging classroom experience.
• Item: Class Evaluation #5.1 – The facilitators encouraged participation.
• Item: Class Evaluation #5.4 – The facilitators encouraged discussion between participants.
By The Numbers Learning Goal
• Goal: To ensure student understanding and/or recognition of JMU alcohol and drug policies.
• Objective: After completing BTN, 80% of students will be able to identify 2 JMU Policies relating to alcohol.
• Objective: …identify the circumstances for parental notification.
• Objective: …identify the parties able to apply for amnesty in a given situation.
• Objective: …identify the geographic locations in which JMU will address an alcohol/drug violation.
• Objective: …articulate the three strike policy.
By The Numbers Learning Goal
• Goal: To ensure student understanding and/or recognition of concepts surrounding alcohol.
• Objective: After completing BTN, 60% of students will be able to provide the definition of a standard drink for beer, wine, and liquor.
• Objective: …identify the definition for BAC.
• Objective: …describe the relationship between tolerance and BAC.
• Objective: …identify at least 2 factors that influence BAC.
• Objective: …identify the definition of the point of diminishing returns.
• Objective: …identify how the body processes alcohol and its effects on the body.
By The Numbers Learning Goal
• Goal: To ensure student understanding and/or recognition of concepts surrounding alcohol consumption.
• Objective: After completing BTN, 80% of students will be able to correctly identify the definition of the point of diminishing returns.
• Item: Assessment Question #12, #29
• Activity: Tolerance Activity, Point of Diminishing Returns discussion
• Objective: After completing BTN, 80% of students will be able to identify how the body processes alcohol and its effects on the body.
• Item: Assessment Question #8, #9, #10
• Activity: Alcohol in the Body Activity
Developing Learning Outcomes
• Should be Student Focused – worded to express what the student will learn, know, or do (Knowledge, Attitude, or Behavior)
• Should be Reasonable – should reflect what is possible to accomplish with the program
• Should be Measurable – “know” and “understand” are not measurable; the action one can take from knowing or understanding is.
• Should have Success Defined – what is going to be considered passing?
Bloom’s Taxonomy (levels ordered from less complex to more complex)
1. Knowledge – Recognize facts, terms, and principles
2. Comprehension – Explain or summarize in one’s own words
3. Application – Relate previously learned material to new situations
4. Analysis – Understand organizational structure of material; draw comparisons and relationships between elements
5. Synthesis – Combine elements to form a new original entity
6. Evaluation – Make judgments about the extent to which material satisfies criteria
Bloom’s Taxonomy – Verbs by Level
1. Knowledge – match, recognize, select, compute, define, label, name, describe
2. Comprehension – restate, elaborate, identify, explain, paraphrase, summarize
3. Application – give examples, apply, solve problems using, predict, demonstrate
4. Analysis – outline, draw a diagram, illustrate, discriminate, subdivide
5. Synthesis – compare, contrast, organize, generate, design, formulate
6. Evaluation – support, interpret, criticize, judge, critique, appraise
The ABCD Method
• A = Audience: What population are you assessing?
• B = Behavior: What is expected of the participant?
• C = Conditions: Under what circumstances is the behavior to be performed?
• D = Degree: How well must the behavior be performed? To what level?
The ABCD Method: Example (from “How to Write Clear Objectives”)
• Objective: After completing BTN, 80% of students will be able to describe the relationship between tolerance and BAC.
• Audience: By the Numbers participants
• Behavior: Describe the relationship between tolerance and BAC
• Condition: After taking the class
• Degree: 80%
Common Mistakes
• Vague behavior – Example: Have a thorough understanding of the university honor code.
• Gibberish – Example: Have a deep awareness and thorough humanizing grasp on…
• Not Student-Focused – Example: Train students on how and where to find information.
Fidelity Assessment
• Are you doing what you say you’re doing?
• Helps to ensure your program is implemented as you intended
• Links learning outcomes to programming
• Helps to answer “why” we aren’t observing the outcomes we think we should be observing
Fidelity Components
• Program Differentiation: How are the many components of your program different from one another?
• Adherence: Was your program delivered as intended?
• Quality: How well were the components administered?
• Exposure: How long did each component last? How many students attended?
• Responsiveness: Were participants engaged during the program?
Fidelity Checklist – Generic
Columns: Student Learning Outcomes | Program Component | Duration | Features | Adherence to Features | Quality
Example row: Objective X | Component(s) aligned with Objective X | Length of component | List of specific features | (Y/N) recorded for each feature | Quality rating for each feature

What is rated?
• The live/videotaped program

Who does the rating?
• Independent auditors
• Facilitators
• Participants