Transcript of PET 735 presentation, week 15
Metzler, M. W., & Tjeerdsma, B. L. (1998). PETE program assessment within a development, research, and improvement framework. Journal of Teaching in Physical Education, 17(4), 468-492.
Presented by Renee’ Brown and Adam Keath
Type and purpose of study/paper; theoretical framework/background
• The purpose of this article is to describe a development, research, and improvement (DRI) framework for conducting comprehensive program assessment in PETE programs.
• Type of paper: a descriptive analysis of the DRI model for program assessment and improvement.
• Examines multiple frameworks within the context of program evaluation.
Background
• The neo-reform movement began with the publication of A Nation at Risk.
– Perceptions of relationships between teacher quality and the condition of public schools
• Who is to blame for the inadequacies of American schools?
– What is the major force in current reform efforts?
– Reform efforts occurred everywhere
• Driven by NCATE, the Holmes Group, and state legislators
• Reforms have succeeded in making teacher education different, but there is little evidence that they make preservice teachers more effective.
Definitions
• Program
– Meant to be inclusive
– Goodlad's 7 general teacher education program areas
– Most programs are completed at the end of student teaching or at graduation; few extend into the induction years.
• Degree
• Program Assessment
– The "orphan" of teacher education
– Includes any regular, systematic process that a faculty designs.
• Assessment vs. Evaluation
– Uses Stufflebeam's definitions
Program Assessment Practices in PETE
DRI Process
• Developmental stage
– Thorough and honest description
• Research stage
– Driven by decisions
• Decision making & improvement stage
– A combination of the developmental and research stages
– Determines the effectiveness of each component in promoting student acquisition of stated programmatic goals and outcomes
– Making informed and systematic decisions to maintain or improve program components
Developmental Stage
• What are the main programmatic goals and philosophies?
– The philosophy of the PETE faculty at the institution
• What is the scope of the program's jurisdiction and accountability?
– Regulations set by external groups and agencies
• Departments, colleges, and universities
• State certification, national accreditation
• What is the program's knowledge base?
– PETE programs should strive to teach pedagogical skills
– Shulman's (1987) 7 categories of the knowledge base for teaching
– NASPE has developed standards for the knowledge, dispositions, and performance abilities teachers should possess
Developmental Stage (cont.)
• How are knowledge bases interpreted?
– Knowledge bases are derived, combined, adapted, or adopted from various sources
– PETE faculty need to review each construct and each item in the knowledge base to arrive at a consensus interpretation.
• How do students acquire the program's knowledge base?
– Through a variety of learning experiences within the planned curriculum
• Courses, field experiences, interactions with other students and faculty
• What is the program structure?
– What is structure?
– Program structures that have received much attention and been adopted, in part or wholly, by PETE faculty
Research Stage
• What evidence demonstrates that students have learned the knowledge base?
– There can be many kinds of evidence that the knowledge base has been acquired
• Ex.: observation of behaviors or skills; examination of written documents, student lesson plans, course papers, assignments, etc.
– PETE faculty will use evidence that is directly related to the goals set for their program.
• What data collection techniques are available and congruent with the assessment questions?
– Teaching and learning behaviors
– Teacher beliefs and attitudes
– Teachers' motor skills, tactical knowledge, and fitness
– Teachers' thoughts, decision making, and knowledge structures
Research Stage (cont.)
• How and when is the needed evidence obtained?
– Once PETE faculty decide what evidence is needed, they design a plan to collect the data.
• Evidence collection depends on the stated programmatic outcomes and on which data collection techniques will be used
• PETE faculty decide which type of program assessment is needed
– Formative vs. summative
Decision-Making Stage
• How is the gathered evidence organized and used for assessment?
– Recommendation: develop a database that allows faculty to analyze individual students, subgroups, cohorts, and all students who have been in the program since data collection began.
– Not all collected samples will be usable (e.g., a videotaped session with interruptions)
– A checking procedure can help the data analysis process
Decision-Making Questions
• Maintain, adjust, revise, or restructure the program?
– Maintain
– Adjust
• Ex.: change a course textbook, add or delete small parts of course content, fine-tune the schedule
– Revise
• Ex.: change instructional models, implement skill proficiency testing, re-sequence pedagogy courses, change graduation requirements
– Restructure
• Ex.: move to a professional development school model, hire new program leadership or faculty, develop a new strategic plan, change to a post-baccalaureate degree program
Conclusions
• Maintaining systematic program assessment requires collaboration, commitment, and cooperation.
• Systematic program assessment can occur only when these conditions are in place and maintained.
What did this paper mean to me?
• Renee: With systematic programming, there may be false starts, but persistence and dedication to the task at hand are key.
• Adam: It gave me a base for looking at program evaluation from a PETE standpoint.
Questions?
• Have these new reforms improved the quality of initially certified teachers?