Assessment Tools for Online Courses and Programs (SUNYLA 2014)

Posted on 06-May-2015


Overview of rubrics that can be used to evaluate individual online courses and entire online education programs. A link to speaking notes from this presentation and an extensive bibliography of additional resources are provided in the final slides.

Transcript of Assessment Tools for Online Courses and Programs (SUNYLA 2014)

ASSESSMENT TOOLS FOR ONLINE COURSES AND PROGRAMS

KABEL STANWICKS, UNIVERSITY AT ALBANY

SUNYLA 2014

QUICK POLLS

How many people have taken an online course?

How many people have taught, or are currently teaching, an online course?

How many people are planning on teaching, or would like to teach, an online course in the future?

ONLINE EDUCATION

Over the last decade, online education has continued to expand

In 2011, 32% of students in higher education were taking at least one online class (Allen & Seaman, 2013)

Involves more than just using technology to deliver instruction

ASSESSMENT NEEDS

Valid and reliable assessment tools are needed for online courses and programs

Quality assurance is a major challenge faced by higher education (Shelton, 2010)

Understanding and using e-learning assessment tools helps us understand students and maximize their potential for successful learning! (Black et al., 2008)

COURSE EVALUATION: VIRTUAL HIGH SCHOOL

Yamashiro & Zucker (1999)

Evaluates:

• curriculum/content

• pedagogy

• course design

• assessment

COURSE EVALUATION: CHICO STATE

Chico State (2003, 2009): http://www.csuchico.edu/roi/

Evaluates:

• learner support and resources

• online organization and design

• instructional design and delivery

• assessment and evaluation of student learning

• innovative teaching with technology

• faculty use of student feedback

Rubric provides examples for baseline, effective, and exemplary courses


COURSE EVALUATION: QUALITY MATTERS

Quality Matters (2011): https://www.qualitymatters.org

Evaluates (using point-based rubric):

• course overview

• learning objectives

• assessment

• instructional materials

• learner interaction and engagement

• course technology

• learner support

• accessibility


COURSE EVALUATION: BLACKBOARD

Blackboard Exemplary Course Program Rubric (2013): http://www.blackboard.com/resources/catalyst-awards/BbExemplaryCourseRubric_Nov2013.pdf

Evaluates:

• course design

• interaction and collaboration

• assessment

• learner support

Rubric uses weighted points with details on scoring courses as incomplete, promising, accomplished, and exemplary within each category.
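As a rough sketch, a weighted point-based rubric like this combines a reviewer's per-category performance level with a category weight to produce a single score. The category weights and point values below are illustrative assumptions for demonstration only, not Blackboard's actual figures; only the four performance-level names come from the rubric itself.

```python
# Sketch of weighted-rubric scoring. Weights and point values are
# assumed for illustration, not taken from the Blackboard rubric.

# Performance levels named in the Blackboard Exemplary Course Program rubric.
LEVEL_POINTS = {"incomplete": 0, "promising": 1, "accomplished": 2, "exemplary": 3}

# Hypothetical category weights (must sum to 1.0 for a 0-3 overall scale).
WEIGHTS = {
    "course design": 0.4,
    "interaction and collaboration": 0.25,
    "assessment": 0.2,
    "learner support": 0.15,
}

def weighted_score(ratings):
    """Combine per-category level ratings into one weighted score."""
    return sum(WEIGHTS[cat] * LEVEL_POINTS[level] for cat, level in ratings.items())

ratings = {
    "course design": "accomplished",
    "interaction and collaboration": "exemplary",
    "assessment": "promising",
    "learner support": "accomplished",
}
# 0.4*2 + 0.25*3 + 0.2*1 + 0.15*2 = 2.05
print(round(weighted_score(ratings), 2))
```

Under these assumed weights, a course rated mostly "accomplished" lands a little above 2 on the 0-3 scale; the same mechanism works for any rubric that maps level labels to points.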


COURSE EVALUATION: RUBRIC COMPARISON

Chico State ROI: easiest rubric to use, suited to assessment of course design, provides detailed information for ranking

Quality Matters: good for peer review, lacks the detailed examples found in the other rubrics

Blackboard: very thorough; intended for peer review but can be used for self-assessment of course design; more complex to use; provides detailed information for ranking

PROGRAM EVALUATION TOOLS

Lockhart & Lacy (2002): institutional readiness and administration, faculty services, instructional design and course usability, student readiness, student services, learning outcomes, retention

Survey to Assess Student Opinions of Distance Education (Chaney, 2007): course-specific experiences and general distance education experiences

Khan & Smith (2007): institutional, management, ethical, technological, interface design, pedagogical, resource support, evaluation

Quality Scorecard for the Administration of Online Education Programs (Shelton, 2010): institutional support, course development, teaching and learning, course structure, student support, faculty support, evaluation and assessment

ADVANCED TOOLS

Template Project (Tricker et al., 2001): identifies gaps between students’ requirements for a course and the course’s actual performance

Roblyer & Wiencke (2003): evaluates student interaction, focusing on social rapport, instructional design, technology resources, and learner and instructor engagement

PDPP Evaluation Model (Zhang & Cheng, 2012): evaluates planning, development, process, and product

QUESTIONS

Kabel Stanwicks, kstanwicks@albany.edu

This presentation is available on Slideshare: http://ow.ly/xX95Z

Speaking notes for this presentation available at: https://db.tt/voJNFajI

ADDITIONAL RESOURCES

Allen, E., & Seaman, J. (2013, January). Changing Course: Ten Years of Tracking Online Education in the United States. Sloan Consortium.

Bates, T. (2012, August 5). What’s right and what’s wrong about Coursera-style MOOCs [Web log post]. Retrieved from http://www.tonybates.ca/2012/08/05/whats-right-and-whats-wrong-about-coursera-style-moocs/

Belanger, Y., & Thornton, J. (2013). Bioelectricity: A Quantitative Approach Duke University’s First MOOC (Report). Retrieved from http://dukespace.lib.duke.edu/dspace/handle/10161/6216

Black, E. W., Ferdig, R. E., & DiPietro, M. (2008). An Overview of Evaluative Instrumentation for Virtual High Schools. American Journal of Distance Education, 22(1), 24-45.

Blackboard (2013). Blackboard Exemplary Course Program Rubric. Retrieved from http://www.blackboard.com/

California State University, Chico (2009). Rubric for Online Instruction. Retrieved from http://www.csuchico.edu/

Chaney, E. H. (2006). The Development of an Instrument to Assess Student Opinions of the Quality of Distance Education (Doctoral dissertation). Available from ProQuest Dissertations and Theses database. (304934887). 

Chaney, E. H., Eddy, J. M., Dorman, S. M., Glessner, L., Green, B. L., & Lara-Alecio, R. (2007). Development of an Instrument to Assess Student Opinions of the Quality of Distance Education Courses. American Journal of Distance Education, 21(3), 145–164.

Cross, S. (2013). Evaluation of the OLDS MOOC curriculum design course: participant perspectives, expectations and experiences. Retrieved from http://oro.open.ac.uk/37836/

Downes, S. (2013, March 18). Evaluating a MOOC [Web log post]. Retrieved from http://halfanhour.blogspot.ca/2013/03/evaluating-mooc.html

Khan, B., & Smith, H. (2007). A Program Satisfaction Survey Instrument for Online Students. In B. Kahn (Ed.), Flexible Learning in an Information Society (pp. 320-337). Hershey, PA: Information Science Publishing.

Legon, R. (2013, April 25). MOOCs do not represent the best of online learning (essay). Inside Higher Ed. Retrieved from http://www.insidehighered.com/views/2013/04/25/moocs-do-not-represent-best-online-learning-essay

Lewin, T. (2013a, April 29). Adapting to Blended Courses, and Finding Early Benefits. The New York Times. Retrieved from http://www.nytimes.com/2013/04/30/education/adapting-to-blended-courses-and-finding-early-benefits.html

Lewin, T. (2013b, December 10). After Setbacks, Online Courses Are Rethought. The New York Times. Retrieved from http://www.nytimes.com/2013/12/11/us/after-setbacks-online-courses-are-rethought.html

Lockhart, M., & Lacy, K. (2002). An assessment model and methods for evaluating distance education programmes. Perspectives: Policy & Practice in Higher Education, 6(4), 98–104.

McAuley, A., Stewart, B., Siemens, G., & Cormier, D. (2010). The MOOC model for digital practice.

MOOC pedagogy: the challenges of developing for Coursera. (2012, August 8). Association for Learning Technology Online Newsletter. Retrieved from http://newsletter.alt.ac.uk/2012/08/mooc-pedagogy-the-challenges-of-developing-for-coursera/

Pappano, L. (2012, November 2). Massive Open Online Courses Are Multiplying at a Rapid Pace. The New York Times. Retrieved from http://www.nytimes.com/2012/11/04/education/edlife/massive-open-online-courses-are-multiplying-at-a-rapid-pace.html

Parr, C. (2013, April 18). How was it? The UK’s first Coursera Moocs assessed. Times Higher Education. Retrieved from http://www.timeshighereducation.co.uk/news/how-was-it-the-uks-first-coursera-moocs-assessed/2003218.article

Quality Matters (2011). Quality matters rubric standards 2011-2013 edition. Retrieved from https://www.qualitymatters.org

Ripley, A. (2012, October 18). College Is Dead. Long Live College! Time. Retrieved from http://nation.time.com/2012/10/18/college-is-dead-long-live-college/

Roblyer, M., & Wiencke, W. (2003). Design and Use of a Rubric to Assess and Encourage Interactive Qualities in Distance Courses. The American Journal of Distance Education, 17(2), 77-98.

Shelton, K. (2010). A Quality Scorecard for the Administration of Online Education Programs: A Delphi Study. Journal of Asynchronous Learning Networks, 14(4), 36–62.


Shelton, K. (2011). A Review of Paradigms for Evaluating the Quality of Online Education Programs. Online Journal of Distance Learning Administration, 14(1).

Siemens, G. (n.d.). Announcing: MOOC Research Initiative [Web log post]. Retrieved from http://www.moocresearch.com/blog

Tricker, T., Rangecroft, M., Long, P., & Gilroy, P. (2001). Evaluating Distance Education Courses: the student perception. Assessment & Evaluation in Higher Education, 26(2), 165–177.

Uvalić-Trumbić, S., & Daniel, J. (2013). Making sense of MOOCs: The evolution of online learning in higher education. In Scaling up Learning for Sustained Impact (pp. 1-4). Springer Berlin Heidelberg.

Yamashiro, K. & Zucker, A. (1999). An Expert Panel Review of the Quality of Virtual High School Courses: Final Report. Arlington, VA: SRI International. Retrieved from http://thevhscollaborative.org/sites/default/files/public/vhsexprt.pdf

Zhang, W. & Cheng, Y. L. (2012). Quality Assurance in E-Learning: PDPP Evaluation Model and its Application. International Review of Research in Open & Distance Learning, 13(3), 66–82.