1 Orientation Session at 2003 Assessment Conference A Richer and More Coherent Set of Assessment...
1
Orientation Session at 2003 Assessment Conference
A Richer and More Coherent Set of Assessment Practices
Peggy L. Maki
Senior Scholar
Assessing for Learning
Materials from Maki’s forthcoming book, A Framework for Building An Institutional Commitment to Assessing Student Learning, 2004, Stylus Publishing and AAHE
2
Focus of Our Assessment Efforts
What do you expect your students to know and be able to do by the end of their education at your institution?
What do the curricula and other educational experiences “add up to”?
What do you do in your classes or in your programs to promote the kinds of learning or development that the institution seeks?
3
Questions (cont’d)
Which students benefit from which classroom teaching strategies or educational experiences?
What educational processes are responsible for the intended student outcomes the institution seeks?
How can you help students make connections between classroom learning and experiences outside of the classroom?
What pedagogies/educational experiences develop knowledge, abilities, habits of mind, ways of knowing/problem solving?
4
Questions (cont’d):
How are curricula and pedagogy designed to develop knowledge, abilities, habits of mind, ways of knowing?
What methods of assessment capture desired student learning—methods that align with pedagogy, content, and curricular design?
How do you intentionally build upon what each of you teaches or fosters to achieve programmatic and institutional objectives?
5
Approaches to Learning
Surface Learning
Deep Learning
6
[Diagram: the Learner at the center, surrounded by Collaboration, Technology, Service Learning, Work Life, Dorm Life, Services, Learning Communities, Courses, Experiential Learning, Internship, Study Abroad, and Advising]
7
What Does the Sum Look Like?
Knowledge/Understanding
Abilities
Dispositions
8
“Every assessment is also based on a set of beliefs about the kinds of tasks or situations that will prompt students to say, do, or create something that demonstrates important knowledge and skills. The tasks to which students are asked to respond on an assessment are not arbitrary.”
National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press, 2001, p. 47.
9
Assessing for Learning
Pedagogy/Instructional Design
Content/Curricular Design
Students’ Learning Styles and Histories
Assessment Task Designed to Ascertain How Well Students Achieve Expected Outcome
10
Assumptions Underlying Teaching
Actual Practices
Assumptions Underlying Assessment Tasks
Actual Tasks
11
Alignment of our Outcomes
Institutional Outcomes
Programmatic Outcomes
Course Outcomes
12
When Do You Seek Evidence?
Formative—along the way? For example, to ascertain progress or development
Summative—at the end? For example, to ascertain mastery level of achievement
13
What Tasks Elicit the Learning You Desire?
Tasks that require students to select among possible answers (multiple choice test)?
Tasks that require students to construct answers (students’ problem-solving and thinking abilities)?
14
What Are Outcome Statements?
Outcome statements describe what students should know, understand, and be able to do based on how they have learned.
They emerge from what we value and how we teach; that is, they emerge from our educational practices and are developed through consensus.
15
What’s at the Center of an Outcome Statement?
Active verbs, such as:
create
analyze
construct
apply
16
Example from ACRL:
The information literate student evaluates information and its sources critically and incorporates selected information into his or her knowledge and value system.
ONE OUTCOME:
Student examines and compares information from various sources in order to evaluate validity, reliability, accuracy, timeliness, and point of view or bias.
17
Develop Rubrics to Assess Work:
Levels of achievement
Criteria that distinguish good work from poor work
Descriptions of criteria at each level of achievement
For example, mastery levels (novice to expert)
18
Evidence of Student Performance:
Student work samples
Collections of student work (e.g., portfolios)
Capstone projects
Program-embedded cases/questions
Observations of student behavior
Internal juried review of student projects
19
External juried review of student projects
Externally reviewed internship
Performance on a case study/problem
Performance on problem plus student analysis
Team-based project
20
Essay tests blind scored across units
Visual representations (graphs, charts, etc.)
Locally developed tests
Performance on national licensure examinations
Standardized tests
Pre- and post-tests
21
Interpret Results
Seek patterns
Build in institutional-level and program-level discourse
Tell the story that explains the results—triangulate
22
Determine what you wish to change, revise, or how you want to innovate
Implement changes
Assess to determine efficacy of changes
Focus on collective effort—what we can do
23
“What and how students learn depends to a major extent on how they think they will be assessed.”
John Biggs, Teaching for Quality Learning at University: What the Student Does. Society for Research into Higher Education & Open University Press, 1999, p. 141.