Rooted in Research: Establishing Coherent Partnerships between Institutional Research and the Quality Enhancement Plan
Dr. Ghazala Hashmi, Coordinator of the Quality Enhancement Plan
Dr. Jackie Bourque, Director, Office of Institutional Effectiveness
J. Sargeant Reynolds Community College, Richmond, Virginia
SACS Annual Conference, Orlando, Florida, December 2011
The Session’s Goals
• Present J. Sargeant Reynolds Community College’s use of institutional research to 1) identify, develop, and implement its QEP, and 2) bring a variety of college units into effective partnerships.
• Share the challenges and the successes the College has encountered in gathering data and in making effective use of data to improve student learning in distance education.
Outcomes of this Session
By the end of this session we hope you will be able to:
• Identify best practices in effectively gathering and using institutional data points for the selection, development, and implementation of a QEP topic.
• Develop effective, collaborative partnerships between the QEP Team and the Office of Institutional Research.
• Share with your colleagues templates for gathering the grounding institutional data that helps to guide QEP selection teams towards an effective QEP topic, to develop the substance of the QEP, and to assess the effectiveness of the QEP during implementation.
[Diagram] Student Success in Distance Learning, supported by: Student Orientation & Student Support, Faculty Training, and Student Readiness
The Ripple Effect: Transforming Student Success in Distance Learning, One Student & One Instructor at a Time
The Quality Enhancement Plan at J. Sargeant Reynolds Community College
The Ripple Effect
QEP Topic Selection
Digging into Data
Data Profile Template
Growth in Number of Distance Learning Enrollments

Term    1999-2000  2000-01  2001-02  2002-03  2003-04  2004-05  2005-06  2006-07  2007-08
Summer       751      856      984     1079     1282     1529     1979     2132     2437
Fall        1049     1140     1223     1584     1814     2175     2524     2817     3414
Spring      1085     1189     1463     1680     2086     2462     2710     2802     3393
TOTAL       2885     3185     3670     4343     5182     6166     7203     7768     9244
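The TOTAL row above implies steady compound growth in distance learning enrollments. As a quick sketch in plain Python (using only figures from the table), the year-over-year and compound annual growth rates can be computed like this:

```python
# Annual distance learning enrollment totals, 1999-2000 through 2007-08,
# taken from the TOTAL row of the table above.
totals = [2885, 3185, 3670, 4343, 5182, 6166, 7203, 7768, 9244]

# Year-over-year growth rates (as fractions).
yoy = [(b - a) / a for a, b in zip(totals, totals[1:])]

# Compound annual growth rate over the eight one-year intervals.
cagr = (totals[-1] / totals[0]) ** (1 / (len(totals) - 1)) - 1

print(f"CAGR: {cagr:.1%}")  # roughly 15.7% per year
```

Enrollments more than tripled over the nine years shown, which is the pressure behind the QEP topic.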
Growth in Number of Distance Learning Sections

Term    1999-2000  2000-01  2001-02  2002-03  2003-04  2004-05  2005-06  2006-07  2007-08
Summer        65       75       83       80       97      109      136      142      144
Fall          83      108      111      115      138      148      167      176      193
Spring        95      103      119      121      139      166      172      185      198
TOTAL        243      286      313      316      374      423      475      502      534
[Chart] Increasing Numbers of Faculty Teaching Distance Learning Sections, Full-Time versus Part-Time, 2003-2004 through 2009-2010
Student Success Rates in DL Classes – Original Template
Student Success Rates in Face-to-Face Classes – Original Template
Student Persistence Rates in Face-to-Face versus Distance Learning Classes
An Example of the New Data Template
Discipline/Domain    On-Campus, Face-to-Face              Distance Learning                    p < .10
                     Persisted        Withdrew            Persisted        Withdrew            Disadvantage
                     N      %         N      %            N      %         N      %
ACC – Accounting
ACC 115              86   88.7%      11   11.3%           57   76.0%      18   24.0%          * Online
ACC 211             382   85.3%      66   14.7%          147   90.2%      16    9.8%
ACC 212             272   95.4%      13    4.6%          131   94.9%       7    5.1%
ADJ – Administration of Justice
ADJ 100              33  100.0%       0    0.0%           33   97.1%       1    2.9%
ADJ 105              31   86.1%       5   13.9%           60   89.6%       7   10.4%
ADJ 201              33  100.0%       0    0.0%           29   93.5%       2    6.5%
ART – Art
ART 100              57   89.1%       7   10.9%          330   88.7%      42   11.3%
ART 101              37   92.5%       3    7.5%           29   70.7%      12   29.3%          * Online
ART 102              17   70.8%       7   29.2%           55   93.2%       4    6.8%          * Regular
ART 106              17   94.4%       1    5.6%           41   97.6%       1    2.4%
ASL – American Sign Language
ASL 201              38   90.5%       4    9.5%            8  100.0%       0    0.0%
BIO – Biology
BIO 101            1051   91.9%      93    8.1%           98   91.6%       9    8.4%
BIO 106             113   91.1%      11    8.9%           53   82.8%      11   17.2%          * Online
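The asterisks in the template flag persistence gaps that are statistically significant at p < .10, marking which delivery mode is disadvantaged. The slides do not say which test the Office of Institutional Effectiveness used; one common choice that reproduces the flags shown above is a two-proportion z-test, sketched here in plain Python:

```python
import math

def persistence_flag(ftf_persist, ftf_withdraw, dl_persist, dl_withdraw, alpha=0.10):
    """Flag which mode is disadvantaged when the persistence gap is
    significant at the given alpha, using a two-proportion z-test.
    (A sketch; the college's actual test is not stated in the slides.)"""
    n1 = ftf_persist + ftf_withdraw          # face-to-face enrollment
    n2 = dl_persist + dl_withdraw            # distance learning enrollment
    p1, p2 = ftf_persist / n1, dl_persist / n2
    pooled = (ftf_persist + dl_persist) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    if p_value >= alpha:
        return None                          # gap not significant
    return "Online" if p2 < p1 else "Regular"

# ACC 115 from the table: 86/11 face-to-face, 57/18 online.
print(persistence_flag(86, 11, 57, 18))      # -> Online
# ACC 211: the gap favors online but is not significant at p < .10.
print(persistence_flag(382, 66, 147, 16))    # -> None
```

Under this test, ACC 115, ART 101, and BIO 106 come out flagged "* Online" and ART 102 "* Regular", matching the template.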
QEP Development
Defining a Data-driven Plan
Growing the Plan
External Research
Best practices: Identifying national standards relevant to our QEP Topic
Other institutional efforts: Identifying other institutional efforts, particularly at compatible colleges, similar to our own
A review of the literature: Developing annotated bibliographies in order to present an academic review of the material
Growing the Plan
Internal Research
Evaluating topical data:
• Distance Learning Student Survey and Report – what do our students think about course design? Who are our online students? What are the barriers to their success?
• Discipline Review of Online Courses, demonstrating broad gaps between face-to-face and online student success rates
Consulting faculty:
• Distance Learning Faculty Focus Group – what are their perceptions? What do they consider to be the barriers to student success?
• Faculty training needs – evaluating needs in technology training and in instruction in pedagogy, course design, and assessment of student learning
Evaluating general college data:
• Continued evaluation of enrollments, success rates, and persistence rates
An Example of Data Collected and Evaluated: FTES Comparison – Fall 2009 with Fall 2008, by Campus

Campus          Fall 2008 (as of 10/20/08)  Fall 2009 (as of 10/19/09)  FTES Change   % Change
Campus One             3,060.93                   3,375.27                  314.33        10.27
Campus Two             1,692.20                   1,867.67                  175.47        10.37
Campus Three             228.93                     250.87                   21.93         9.58
Off-Campus                12.00                       9.60                   -2.40       -20.00
Off-Campus               830.60                     482.27                 -348.33       -41.94
Virtual                  792.00                   1,056.20                  264.20        33.36
Unknown                   12.40                       0.00                  -12.40      -100.00
Total                  6,629.07                   7,041.87                  412.80         6.23
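The change columns in the FTES comparison are simple differences and percentages. A small sketch (function and parameter names are my own, not from the report) shows how the comparison can be regenerated from the two fall snapshots:

```python
def ftes_change(fall_2008, fall_2009):
    """Return (FTES change, % change) for one campus,
    rounded to two decimals as in the table above."""
    change = fall_2009 - fall_2008
    pct = change / fall_2008 * 100
    return round(change, 2), round(pct, 2)

# Virtual campus from the table above.
print(ftes_change(792.00, 1056.20))   # -> (264.2, 33.36)
```

The 33.36% jump in Virtual FTES, against roughly 10% growth on the physical campuses, is the kind of internal evidence that kept the QEP focused on distance learning.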
QEP Implementation
Driving, Detouring, and Documenting
The Driver: the QEP Assessment Plan
Using the Emerging Data to Drive the Plan
Student Readiness –
1. A student profile emerges through SmarterMeasure.
2. We evaluate the relationships between this profile and student success.
3. We construct our distance learning orientation around institutional data.
Student Orientation –
1. Students and faculty provide qualitative feedback.
2. We evaluate the success of the orientation by measuring its impact on students.
3. We modify the orientation based on data.
The Driver: the QEP Assessment Plan (continued)
Faculty Training –
1) Faculty provide self-assessments of their own skills and understanding of course design and online teaching.
2) Ongoing peer-to-peer reviews provide qualitative and quantitative data.
3) We evaluate student success and student persistence rates of trained and untrained faculty.
4) Faculty provide feedback about the impact of the training.
5) We assess our own training services through the modules that have been designed and delivered.
6) The QEP Team makes modifications based on the results of the data.
Partnerships in Institutional Research
[Diagram] Partnership structure:
• QEP Coordinator (a faculty member with online experience)
• QEP Assistant Coordinator
• Office of Institutional Effectiveness (Research Analyst)
• Office of Student Affairs
• Office of Professional Development (Technology Training)
• Office of Academic Affairs
• Center for Distance Learning
• Executive Vice President
The Detours
Peer Academic Leaders (PALs) Program
Rewards and Recognitions Program
New Faculty Certification for Distance Learning Policy
New Communications Tools
The New Faculty Development Database
Assessment leads to new sprouts: leaving room for growth and expansion into new territories has been both challenging and rewarding.
Peer Academic Leaders (PALs) Program
Although the PALs program developed from the QEP and from a limited, on-campus program, it presented new challenges:
• Funding
• Recruitment and Training of Peer Leaders
• Administrative Oversight
• New Marketing
• New Assessment Tools and Activities
New Communications Tools
The QEP Blog
A New Faculty Development Database that also integrates HRMSIS and the Knowledge Center
Fall 2010 Statistics
Number of Current Distance Learning Faculty: 153
Number who have completed either Tier One or Tier Two Training: 48
Number who have not completed either Tier One or Tier Two Training: 105
[Chart] Current Distance Learning Faculty who have, and have not, completed either TOP or IDOL training
The Documentation: using data to support QEP initiatives
• Reporting to College Executives: WEAVE Online
• Reporting to Broader QEP Team: SharePoint
• Annual Reports to general college audience
• Regular summaries and reports of ongoing assessment efforts and results on public blog
• Ongoing presentations to internal audiences
Digging, Driving, Documenting
In summary, we have found that the ongoing effectiveness of, and enthusiasm for, the QEP is built upon three primary factors, all of which relate to the research of the QEP:
• Digging into the data (gathering, evaluating, discussing)
• Driving with data as the guide (building, daydreaming, detouring)
• Documenting the data (communicating and sharing)
For More Information
Jackie Bourque, Director, Office of Institutional Effectiveness – [email protected]
Ghazala Hashmi, Coordinator, Quality Enhancement Plan – [email protected]
www.reynolds.edu/qep