FACULTY DEVELOPMENT DAY
GENERAL EDUCATION ASSESSMENT
Donna Sundre, EdD
Executive Director, Center for Assessment and Research Studies (CARS)
Professor of Graduate Psychology
James Madison University
Kara Siegert, PhD
Director, University Analysis, Reporting, & Assessment
Salisbury University
January 21, 2010
Purpose
To encourage discussion of common assessment misconceptions and to describe the assessment process. The ultimate goals for the day are to: provide assessment resources and best practices, describe the assessment process, discuss the role SU faculty will play in developing the assessment process at the institution, and collect feedback from faculty on the assessment strategies they recommend for collecting data on student achievement of General Education outcomes.
ITINERARY
9:00-9:15    Introductions & Itinerary
9:15-10:00   Assessment Misconceptions
10:00-10:45  Assessment Process & Assessment at SU
10:45-11:30  Data Collection Methods
11:30-12:00  Developing a Culture of Assessment
12:00-12:30  Working Lunch (Wicomico Room)
12:30-1:00   Provost Allen
1:00-1:30    Questions & Introduction to Afternoon Activity
1:30-3:00    Roundtables
3:00-4:00    Faculty Feedback
CRIMES, MISDEMEANORS, AND FELONIES THAT PREVENT A CULTURE OF ASSESSMENT
Crime, Consequence, and Rehabilitation
Crime: Practice (or non-practice) that results in the breakdown of the assessment process
Consequence: How the crime affects your assessment program
Rehabilitation: How to fix the offending behavior
Of course there are different levels of offenses; we’ve divided our examples into “misdemeanors” and “felonies”
Crime: Focus only on Weaknesses
Level: Misdemeanor I
Consequence: Faculty and administrators complain that assessment focuses on faults
Rehabilitation: Look specifically for strengths, report and publicize them; provide balanced feedback
Crime: Use of Unnecessary Jargon
Level: Misdemeanor I
Consequence: Rolling Eyes (i.e., a lack of interest and, worse, a lack of understanding of results)
Rehabilitation: Know your audience and present at their level. Complex analyses are often useful and appropriate, but offer them in an appendix or technical report, or discuss them with interested colleagues after the meeting.
Crime: Treating GE and the assessment of GE goals and outcomes as the responsibility of only the faculty who teach GE courses
Level: Misdemeanor I
Consequence: Faculty teaching non-General Education courses will disengage from General Education conversations
Rehabilitation: GE includes the most fundamental skills and is therefore taught across all courses, majors, and faculty. Faculty from all disciplines should play a role in developing GE assessments.
Crime: Using Course Grades as Evidence of Student Learning
Level: Misdemeanor II
Consequence: Specific conclusions about student learning and achievement of student learning outcomes cannot be drawn, making it difficult to “close the loop”
Rehabilitation: Develop assessment methods and evaluation strategies that are directly aligned with learning outcomes
Crime: Forgetting that All Research has Limitations
Level: Misdemeanor II
Consequence: Faculty will question whether results are indicative of students’ true ability because: students aren’t motivated; the sample was too small; the test/instrument isn’t perfect; we need more analyses, more data, etc.
Rehabilitation: Use the assessment process and results to improve and inform the process. There will always be factors outside of our control. The key is appropriate interpretation of results; faculty should guide this.
Crime: Only Recommending Multiple-Choice Tests for Assessment
Level: Misdemeanor III
Consequence: Skeptical faculty and administrators. They are more likely to question the validity of the data.
Rehabilitation: Use the Student Learning Goals and outcomes to determine the most appropriate method of data collection.
Crime: Surprise Stakeholders with Poor Results
Level: Misdemeanor III
Consequence: Defensive faculty and administrators. They are more likely to try to undermine assessment efforts.
Rehabilitation: Share poor results informally with stakeholders first. Have them brainstorm why the results turned out as they did. Include them in presentations.
Crime: Assessment Reports Collect Dust
Level: Felony
Consequence: Faculty will consider assessment a bureaucratic exercise invented by administrators and government for the sole purpose of torturing them.
Rehabilitation: Make sure time and resources are allotted for faculty to consider and use assessment results.
Crime: Assessment Data Reported at the Individual Faculty Level
Level: Felony (Capital Offense)
Consequence: ‘Audit’ mode confirmed; faculty assume results are being used to assess them, not programs. Expect mass hysteria and mutiny.
Rehabilitation: There may be none. Administration will need to earn respect. Allow faculty to interpret findings and suggest improvements.
Things to Consider
You already do assessment! It is a systematic basis for making inferences about student development and growth
Think about why you go to work every day: your purpose
Do you see your students as your partners in learning?
What feedback from your partners would be most beneficial for program improvement?
Final Questions
What assessment crimes have you seen committed here or at other institutions?
What assessment crimes are you most concerned might take place at SU?
How can we best assure that these misdemeanors and felonies are not committed at SU?
Other Questions, Comments, or Concerns?
ASSESSMENT PROCESS
Association of American Colleges & Universities (AAC&U)
“Almost all of the institutions surveyed (89 percent) are in some stage of either assessing or modifying their general education program. Assessment of cumulative learning outcomes in general education is, in fact, now becoming the norm.”
“Fifty-two percent of institutions are currently assessing cumulative learning outcomes in general education beyond the level of individual course grades, with another 42 percent reporting that they are planning for assessment of cumulative general education learning outcomes.”
AAC&U, 2009, Survey of 433 colleges and universities
Stages of the Assessment Process
1. Establishing Goals, Objectives, and/or Outcomes
2. Selecting or Designing Methods
3. Collecting Credible Information
4. Analyzing and Maintaining Information
5. Using Information for Teaching and Learning Improvement
*Regardless of the level of assessment required, whether it be a single learning objective, a course, a curriculum, or an entire program, the process is the same.
Stages of the Assessment Process
Continuous cycle: Establishing Objectives/Outcomes → Selecting/Designing Instruments → Collecting Information → Analyzing/Maintaining Information → Using Information → back to Establishing Objectives/Outcomes
Student Learning Goals
SKILLS
1. Critical Thinking
2. Command of Language
   2a. Reading
   2b. Writing
   2c. Speaking
   2d. Listening
3. Quantitative Literacy
4. Information Literacy
   4a. Library Use
   4b. Computer Technology Use
5. Interpersonal Communication

DISPOSITIONS
1. Social Responsibility
2. Humane Values
3. Intellectual Curiosity
4. Aesthetic Values
5. Wellness

KNOWLEDGE
1. Breadth of Knowledge
   1a. Arts
   1b. Literature
   1c. Civilization
   1d. Global Issues
   1e. 2nd Culture or Language
   1f. Mathematics
   1g. Social and Behavioral Sciences
   1h. Biological and Physical Sciences
2. Interdependence among Disciplines
What are Student Learning Outcomes?
OUTCOMES
Specific knowledge, skills, or attitudes that students are expected to achieve through their college experience
Describe observable behavior indicative of learning or development
Student-centered!
Aligned with the GE goals and the program’s mission
Specific, Measurable, Attainable, Reasonable, Timely
Curriculum Mapping Example
GENERAL EDUCATION
STUDENT LEARNING GOALS: General Education Student Learning Goals
RATING: Rate the level of importance of each outcome
OUTCOMES: Specific knowledge or skills students develop through their college experience
GEN ED AREA(S): General Education sub-group areas that provide courses for students to attain the identified outcome

SKILLS
1. Critical Thinking
   Rating 3: Assess strengths and weaknesses of arguments in essays written for general audiences. (Gen Ed areas: IIA, IIB, IVB, IVC)
   Rating 1: Compose well-reasoned and argued responses to arguments. (Gen Ed areas: IIA, IIB, IIIA, IIIB, IVC, V)
   Rating 4: Synthesize and apply information and ideas from readings across disciplines. (Gen Ed areas: IA, IB)
Selecting/Designing Instruments
Direct measures are best
Assess the extent to which students have mastered outcomes via:
  Multiple-Choice Tests
  Oral Presentations
  On-Demand Essays
  Course-Embedded Essays
  Portfolios
Typically use some combination
Locating Instruments
Student Learning Goals and Outcomes/Objectives create the engine that drives assessment
Search for commercial instruments ($$): ETS, Pearson, ACT, College Base, CLA
Search for non-commercial instruments
  Check alignment with learning outcomes
  Check measurement properties: reliability and validity (a minimal reliability sketch follows)
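For reliability in particular, a quick check can be run directly from item-level scores. The sketch below is not part of the original slides: it computes Cronbach's alpha (the α statistic reported later in the JMU blueprint table) from a hypothetical student-by-item score matrix; the example data and function name are assumptions used only for illustration.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (students x items) score matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                          # number of items
    item_vars = item_scores.var(axis=0, ddof=1)       # per-item variances
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 students answering 4 dichotomously scored items (1 = correct)
scores = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```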
Selecting or Designing Instruments
Items and tasks must match objectives
Create your own blueprint
What is the purpose of assessment?
JMU example of QR and SR:
  Start off trying to describe the level of student learning
  Move toward describing growth
  Later, establish faculty expectations for GE completers
What type of instruments?
Validating inferences
Cluster 3 Learning Objectives, the item(s) assessing each objective, and score summaries:

1. Describe the methods of inquiry that lead to mathematical truth and scientific knowledge and be able to distinguish science from pseudo-science.
   Items: 2, 5, 9, 14, 18, 28, 38-41, 55-57 (13 items; 19.7% of test)
   Scores: M = 9.25 (71% correct), SD = 1.77, α = .35

2. Use theories and models as unifying principles that help us understand natural phenomena and make predictions.
   Items: 17, 20, 22, 27, 64-66 (7 items; 10.6% of test)
   Scores: M = 4.61 (66% correct), SD = 1.46, α = .32

3. Recognize the interdependence of applied research, basic research, and technology, and how they affect society.
   Items: 1, 15, 16, 43-46 (7 items; 10.6% of test)
   Scores: M = 4.51 (64% correct), SD = 1.61, α = .49

4. Illustrate the interdependence between developments in science and social and ethical issues.
   Items: 2, 19, 24-26, 29, 55-57 (9 items; 13.6% of test)
   Scores: M = 6.47 (72% correct), SD = 1.29, α = .23

5. Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena.
   Items: 4, 7, 8, 10-13, 21, 30-33, 51-53, 58-63 (21 items; 31.8% of test)
   Scores: M = 13.74 (65% correct), SD = 3.06, α = .59

6. Discriminate between association and causation, and identify the types of evidence used to establish causation.
   Items: 3, 34-37, 53, 60-63 (10 items; 15.2% of test)
   Scores: M = 5.93 (59% correct), SD = 1.77, α = .44

7. Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses.
   Items: 5, 6, 9-13, 18, 23, 28, 41, 42, 47-50, 54, 59, 60, 62, 63 (21 items; 31.8% of test)
   Scores: M = 15.10 (72% correct), SD = 2.74, α = .55

8. Evaluate the credibility, use, and misuse of scientific and mathematical information in scientific developments and public-policy issues.
   Items: 2, 14, 24-26, 29, 38-40, 60-63 (13 items; 19.7% of test)
   Scores: M = 7.96 (61% correct), SD = 1.77, α = .29

Quantitative Reasoning subscale
   Items: 3, 4, 7, 8, 10-13, 21, 30-37, 51-53, 58-63 (26 items; 39.4% of test)
   Scores: M = 17.58 (68% correct), SD = 3.63, α = .65

Total Test
   Items: 1-66
   Scores: M = 46.59 (70% correct), SD = 7.34, α = .78
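For reference, the "% correct" values are simply the mean raw score divided by the number of items mapped to that objective; for Objective 1, for example:

\[
\text{percent correct} = \frac{M}{\text{number of items}} = \frac{9.25}{13} \approx 0.71 = 71\%
\]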
Collecting Information
Start with an important question: this will guide your data collection
Cross-sectional design to begin; pre- and post-test later
  Very powerful; faculty love this design
Sampling vs. census data collection
  Methodology will dictate costs and resources
Course-embedded collection
  Where are the ‘natural homes’ for assessment?
Analyzing/Maintaining Information
Reliability has to come first
Validation of inferences is a natural partner for any assessment question:
  Do course grades correlate with performances?
  Can we show evidence of course impact?
  Do students who have completed GE requirements perform better than entering students? (A minimal sketch of such a comparison follows this list.)
  Are there differences by SU, AP, or transfer credits?
  Do students achieve faculty expectations?
  Is there value-added?
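To illustrate how one such validation question might be examined, here is a minimal sketch, not from the original slides, comparing scores of entering students with GE completers. The scores, group sizes, and the choice of Welch's t-test and Cohen's d are illustrative assumptions, not SU or JMU data or procedures.

```python
import numpy as np
from scipy import stats

# Hypothetical total-test scores for two cross-sectional groups
entering   = np.array([42, 38, 45, 40, 44, 39, 41, 43, 37, 46], dtype=float)
completers = np.array([49, 47, 52, 45, 50, 48, 51, 46, 53, 44], dtype=float)

# Welch's t-test: does not assume equal group variances
t_stat, p_value = stats.ttest_ind(completers, entering, equal_var=False)

# Cohen's d with a pooled standard deviation as a rough effect-size estimate
pooled_sd = np.sqrt((entering.var(ddof=1) + completers.var(ddof=1)) / 2)
cohens_d = (completers.mean() - entering.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```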
Creating and Using Information
You need an infrastructure for:
  Sound data collection
  Interpreting and creating good reports
    Surprising results
    Identifying strengths and weaknesses
  Sharing results and improving processes
How can good data be used?
  Improving the assessment process and instruments
  Improving teaching & learning
  Academic program review
  Strategic planning & budgeting
Fulton School Example: History
Used learning goals to develop a rubric that is used to evaluate research papers
The rubric evaluates research, analytical, and communication abilities in general, and as they relate to the study of history in particular
Also assisted, along with the English department, in providing essays for GE assessment
Perdue School Example
Developed six to seven learning goals for both its undergraduate and graduate programs
Each goal has one or more measurable objectives
As of Fall 2009, methods have been developed for assessing each learning goal
Team approach: each learning outcome is assessed by faculty members representing each discipline
Based on data collection, the Perdue School has:
  Made changes to the Common Body of Knowledge Exam
  Expanded professional development opportunities to include a 1-credit junior-year course (BUAD 300) and a non-credit senior-year assessment (BUAD 400) to reinforce its learning goals
Henson School Example
Recent assessment and evaluation activities within the Henson School
Science General Education requirements:
  Two IVA (lab) courses
  One IVA or IVB course (non-lab) or IVC course (Math or COSC)
Routine assessment for accredited programs (Nursing, Respiratory Care, and Medical Lab Sciences)
Seidel School Example
Specialty Program Area Annual Report:
  What does the data show?
  What actions were taken based on these data?
  How will the assessment system change?
These reports have led to changes in:
  Curriculum: classroom management has been added to SCED programs
  Evaluation instruments: modified to better align with program standards
Other assessment examples from your programs that you would like to share?
Are any of you stuck at a particular phase in the assessment process?
SALISBURY UNIVERSITY ASSESSMENT PROGRESS
SU’s Assessment Progress
University Academic Assessment Committee
  Established in 2002
Establishment of the Student Learning Goals
  2000: General Education Task Force
  2009-Present: Alignment with General Education courses
Development of Student Learning Outcomes
  June 2009-Present
General Education Assessment
  Academic Profile/MAPP/Proficiency Profile (2005): critical thinking, written communication, information literacy
  ALEKS
Academic Program Review
  Pilot revisions, AY 2009-10
Academic Profile/MAPP/Proficiency Profile 2005
PROFICIENCY CLASSIFICATION

SKILL DIMENSION      PROFICIENT    MARGINAL     NOT PROFICIENT
Reading Level 1      70% (66%)     21% (20%)     8% (13%)
Reading Level 2      41% (33%)     21% (22%)    38% (45%)
Critical Thinking     7% (4%)      26% (13%)    67% (83%)
Writing Level 1      80% (68%)     16% (23%)     4% (9%)
Writing Level 2      30% (19%)     45% (38%)    25% (43%)
Writing Level 3      12% (8%)      36% (28%)    52% (64%)
Math Level 1         75% (56%)     21% (28%)     3% (16%)
Math Level 2         48% (27%)     25% (30%)    27% (43%)
Math Level 3         17% (6%)      22% (16%)    61% (78%)

*Values in parentheses represent the average % of test-takers from other Master's Level I & II institutions.
Self-Study Assessment Results (2006)

Oral / written communication
  Direct measures: English 101 and 102 scoring rubric / department assessment goals; assessments in individual courses
  Indirect measures: Alumni survey; NSSE
Scientific and quantitative reasoning
  Direct measures: ETS (pilot project); some department assessment goals; assessments in individual courses
  Indirect measures: Alumni survey; NSSE
Technological uses in the major
  Direct measures: Departmental assessments for majors; assessments in individual courses
  Indirect measures: Alumni survey
Information literacy
  Direct measures: Department assessment goals; individual course assessments
  Indirect measures: Alumni survey; NSSE
Critical analysis and reasoning
  Direct measures: ETS (pilot project); some department assessment goals; assessments in individual courses
  Indirect measures: Alumni survey; NSSE; CIRP
APR Proposed Changes: 2009-10
Removal of General Education analysis
Removal of peer comparison
Data pre-populated in tables
Clarification & training
Electronic creation and submission
Rubric-based feedback provided to programs
Periodic review of assessment progress
  October review
  3-Year Assessment Plan & Summary preview
Fulton School curriculum reform APR guidelines
Academic Program Review
PART I: Assessment Plan and Summary
  Program Description
  Student Learning Goals, Outcomes, and/or Objectives
  Assessment Method(s)
  Data Results and Use
  Assessment Action Plan
PART II: Program Review and Action Plan
  Internal Review and Qualitative Analysis
    Summary
    Program Curriculum and Advising
    Resources
  External Review Summary
  Recommendations
  Action Plan
METHODS OF DATA COLLECTION
Not Just Any Data Will Do…
If we want faculty to pay attention to the results, we need credible evidence
To obtain credible evidence:
  We need a representative sample or a census
  We need good instrumentation
    The tasks demanded must represent the content domain
    Reliability and validity
  We need students who are motivated to perform
Prerequisites for Quality Assessment
We must have three important components:
1. Excellence in sampling of students
   Either large, representative student samples or a census
2. Sound assessment instrumentation
   Psychometrically sound assessment methods that map to the domain
   Instruments and methods that faculty find meaningful
3. Students who are motivated to participate in assessment activities
   Can we tell if students are motivated?
   Can we influence examinee motivation?
Data Collection Methods
Course-Embedded: Grand Valley State University
Portfolios: College of William and Mary; George Mason University
Assessment Days: St. Mary's University; Christopher Newport University; James Madison University
Assessment Season: Truman State University
Course-Embedded
Courses serve as the data collection venue
Focused assignments are integral to courses; evaluated as part of the course grade using a common scoring procedure

ADVANTAGES:
  Requires no extra "collection" period
  Increased student motivation
  Reduced costs
  Faculty-driven
DISADVANTAGES:
  Requires course time; intrusive and hard to implement well
  Requires a sound sampling plan
  Requires a ‘common’ assignment and scoring across multiple courses
  Requires additional faculty scoring
Portfolios
Student-developed vs. instructor-compiled
Contain samples that demonstrate attainment of specific GE goals and outcomes
Rubric-based evaluation of samples

ADVANTAGES:
  Can be used to evaluate improvement
  Can evaluate more complex, process-oriented skills
DISADVANTAGES:
  Scoring can be time consuming
  The evaluation method must be explicitly stated to ensure proper evidence is provided
Assessment Days
Two institution-wide Assessment Days
  Fall (August): Incoming freshmen tested at orientation
  Spring (February): Students with 45-70 credits, typically in the sophomore year
Classes are cancelled on this day
All students are required to participate; otherwise course registration is blocked
Students are randomly assigned to take a particular series of instruments
JMU just completed its 23rd Spring Assessment Day
Spring Day is used by many majors to collect data on graduating seniors

ADVANTAGES:
  Data collection requires no course time
  Makes assessment an institution-wide commitment; improves greatly over time
  Creates a culture of assessment
DISADVANTAGES:
  Consider examinee motivation
  Requires institutional commitment; faculty will react poorly at first
  Additional costs for proctors or faculty
Assessment Season
A 2-4 week testing window during which instruments are offered for completion
Students are assigned to certain tests based on a sampling approach (see the assignment sketch after this list)

ADVANTAGES:
  No course time required
  Allows for an extended evaluation period
  Makes assessment an institution-wide commitment
DISADVANTAGES:
  Motivation needs to be examined
  Additional cost to proctor exams
  Requires students to attend a session outside of classroom time
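Both Assessment Days and an Assessment Season depend on assigning sampled students to particular instruments. The following is a minimal sketch of one way such a random assignment could be generated; the roster size, instrument names, and round-robin scheme are illustrative assumptions, not a prescribed procedure from the slides.

```python
import random
from collections import Counter

# Hypothetical roster of eligible students (e.g., those with 45-70 credits)
students = [f"student_{i:04d}" for i in range(1, 501)]

# Hypothetical battery of instruments offered during the collection period
instruments = ["Scientific Reasoning Test", "Quantitative Reasoning Test",
               "Information Literacy Test", "Writing Sample"]

random.seed(2010)          # fixed seed so the assignment is reproducible
random.shuffle(students)   # randomize the roster before assignment

# Deal students out round-robin so each instrument gets a comparable random sample
assignment = {name: instruments[i % len(instruments)]
              for i, name in enumerate(students)}

print(Counter(assignment.values()))  # how many students each instrument received
```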
DEVELOPING A CULTURE OF ASSESSMENT
The Assessment Culture at JMU
JMU requires students to take a series of student outcomes assessments prior to their graduation. These assessments are held at four stages of students’ academic careers:
  as entering first-year students
  at the mid-undergraduate point, when they have earned 45 to 70 credit hours (typically the sophomore year)
  as graduating seniors in their academic major(s)
Students also complete an alumni survey after graduation
-JMU Undergraduate Catalog
The Assessment Culture at JMU
There is a long-standing and pervasive expectation at JMU that assessment findings will guide decision-making. Annual reports, Assessment Progress Templates, program change proposals, and all academic program review self-study documents require substantial descriptions of how assessment guides decision-making.
The Center for Assessment and Research Studies (CARS) is the largest higher education assessment center in the US, with 10 faculty, 3 support staff, and about 15 graduate assistants at the master's and PhD levels.
The Assessment Culture at JMU
CARS supports all general education assessment
CARS facilitates all JMU alumni surveys
CARS supports assessment for every academic program, undergraduate and graduate
CARS supports assessment for the Division of Student Affairs
All programs must collect and report on assessment data annually
Academic Program Reviews are scheduled:
  Every 6 years for ALL academic degree programs
  Every 5 years for General Education ‘clusters’
How do we develop a culture at SU?
Pathway for Institution-Wide Assessment Development
Vision: How assessment can help meet the mission & what we want to achieve with assessment.
High Standards: Measure well what matters, not what is easy to count.
Commitment: Unswerving commitment that withstands economic challenges & changes in leadership.
Resources: Time and monetary resources are investments to ensure student learning and development.
Structure: Institutional committees with faculty and administrators to inform the process, share, & use findings.
Integration: Integration at all stages to help build a "culture of evidence" to inform & strengthen decisions.
LUNCH
PROVOST: DR. DIANE ALLEN
GENERAL EDUCATION OUTCOMES WORK GROUPS:
1. CRITICAL THINKING
2. COMMAND OF LANGUAGE: WRITING
3. QUANTITATIVE LITERACY
4. INFORMATION LITERACY: ACCESS INFORMATION EFFICIENTLY, EVALUATE IT CRITICALLY, AND USE IT APPROPRIATELY
5. INTERPERSONAL COMMUNICATION
What’s Next?
1. Provide draft GE outcomes to department chairs and request feedback (February 2010)
2. Hold an open faculty meeting to request feedback on the draft GE outcomes (March/April 2010)
3. Present the Faculty Senate with draft outcomes and finalize outcomes for a vote (April 2010)
4. Use the final GE outcomes and the information provided at the FDD roundtables to inform UAAC's development of an institution-wide GE assessment process; draft plan, Fall 2010