
Page 1

DOCUMENT RESUME

ED 289 969    CE 048 702

AUTHOR: Kopp, Kathleen
TITLE: Evaluate the Performance of Adults. Module N-6 of Category N--Teaching Adults. Professional Teacher Education Module Series.
INSTITUTION: Ohio State Univ., Columbus. National Center for Research in Vocational Education.
SPONS AGENCY: Department of Education, Washington, DC.
REPORT NO: ISBN-0-89606-237-6
PUB DATE: 87
NOTE: 75p.; For related documents, see ED 274 810 and CE 048 697-701.
AVAILABLE FROM: American Association for Vocational Instructional Materials, 120 Driftmier Engineering Center, University of Georgia, Athens, GA 30602.
PUB TYPE: Guides - Classroom Use - Materials (For Learner) (051)
EDRS PRICE: MF01/PC03 Plus Postage.
DESCRIPTORS: *Adult Education; Adult Educators; Adult Students; Behavioral Objectives; *Competency Based Teacher Education; Higher Education; Individualized Instruction; Job Skills; Learning Activities; Learning Modules; Secondary Education; *Student Evaluation; Teacher Evaluation; Teaching Skills; *Test Construction; *Testing; Vocational Education; *Vocational Education Teachers

ABSTRACT

This module, one in a series of performance-based teacher education learning packages, focuses on a specific skill that vocational educators need to create appropriate learning environments and to plan and manage instruction that is well suited to the learning and psychological needs of today's adults. The purpose of the module is to teach instructors how to evaluate learners' progress in meeting specified objectives. Introductory material provides terminal and enabling objectives, a list of resources, and general information. The main portion of the module includes three learning experiences based on the enabling objectives: (1) demonstrate knowledge of the rationale for and factors involved in student evaluation; (2) critique a test; and (3) critique a case study on developing evaluation plans for adult learners. Each learning experience presents activities with information sheets, samples, worksheets, checklists, and self-checks with model answers. Optional activities are provided. Completion of these three learning experiences should lead to achievement of the terminal objective through the fourth and final learning experience, which requires (1) an actual teaching situation in which to evaluate the performance of adults and (2) a teacher performance assessment by a resource person. An assessment form is included. (YLB)

***********************************************************************

Reproductions supplied by EDRS are the best that can be made from the original document.

***********************************************************************

Page 2

Evaluate the Performance of Adults

U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it.
Minor changes have been made to improve reproduction quality.
Points of view or opinions stated in this document do not necessarily represent official OERI position or policy.

"PERMISSION TO REPRODUCE THIS MATERIAL HAS BEEN GRANTED BY . . . TO THE EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)."

BEST COPY AVAILABLE

Page 3

FOREWORD

This module is one of a series of over 130 performance-based teacher education (PBTE) learning packages focusing upon specific professional competencies of occupational instructors (teachers, trainers). The competencies upon which these modules are based were identified and verified through research as being important to successful teaching. The modules are suitable for the preparation of instructors in all occupational areas.

Each module provides learning experiences that integrate theory and application; each culminates with criterion-referenced assessment of the instructor's performance of the specified competency. The materials are designed for use by teachers-in-training working individually or in groups under the direction and with the assistance of teacher educators or others qualified to act as resource persons. Resource persons should be skilled in the teacher competencies being developed and should be thoroughly oriented to PBTE concepts and procedures before using these materials.

The design of the materials provides considerable flexibility for planning and conducting performance-based training programs for preservice and inservice instructors, as well as business-industry-labor trainers, to meet a wide variety of individual needs and interests. The materials are intended for use by local education agencies, postsecondary institutions, state departments of education, universities and colleges, and others responsible for the professional development of instructors.

The PBTE modules in Category N, Teaching Adults, are designed to enable adult instructors to create appropriate learning environments and to plan and manage instruction that is well suited to the learning and psychological needs of today's adults. The modules are based upon 50 competencies identified and verified as unique and important to the instruction of adults.

Many individuals have contributed to the research, development, field review, and revision of these training materials. Appreciation is extended to the following individuals who, as members of the DACUM analysis panel, assisted National Center staff in the identification of the competency statements upon which this category of modules is based: Doe Hentschel, State University of New York at Brockport; David Holmes, Consortium of the California State University; Joanne Jorz, JWK International Corporation, Virginia; Jean Lowe, Fairfax County Public Schools, Virginia; Jim Menapace, BOC/Lansing-General Motors, Michigan; Norma Milanovich, University of New Mexico; Cuba Miller, Sequoia Adult School, California; Donald Mocker, University of Missouri; and Michael A. Spewock, Indiana University of Pennsylvania.

Appreciation is also extended to the following individuals for their critical field reviews of the six modules in the N category during the development process: Edward K. Allen, Donna Baumbach, Ronald J. Bula, Madelyn R. Callahan, Deborah Clavin, Joe Cooney, Yvonne Ferguson, Howard Harris, Ronald Hilton, David Holmes, Donna E. Johnson, Edward V. Jones, Russell Kratz, Jean Lowe, Frances Melange, Donald L. Martin, Sandy McGechaen, Norma Milanovich, Audni Miller-Beach, Donald Mocker, Christa Oxford, William Reese, Rick Schau, Steven E. Sorg, Michael A. Spewock, Neal Wiggin, and James L. Wright.

Recognition for major individual roles in the development of these materials is extended to the following National Center staff: Harry N. Drier, Associate Director, Development Division, and Robert E. Norton, Program Director, for leadership and direction of the project; Lois G. Harrington, Program Associate, for training of module writers, assistance in the conceptualization and development of the materials, and maintenance of quality control; David J. Kalamas, Graduate Research Associate, for development of illustration specifications; Susan Dziura, for initial art work; and Shellie Tremaine and Cheryl Salyers, for their word processing.

Special recognition is also extended to the staff at AAVIM for their invaluable contributions to the quality of the final printed products, particularly to Sylvia Conine for typesetting, to Marilyn MacMillan for module layout, design, and final art work, and to George W. Smith, Jr., for supervision of the module production process.

THE NATIONAL CENTER FOR RESEARCH IN VOCATIONAL EDUCATION
THE OHIO STATE UNIVERSITY
1960 KENNY ROAD, COLUMBUS, OHIO 43210

The National Center for Research in Vocational Education's mission is to increase the ability of diverse agencies, institutions, and organizations to solve educational problems relating to individual career planning, preparation, and progression. The National Center fulfills its mission by:

Generating knowledge through research.
Developing educational programs and products.
Evaluating individual program needs and outcomes.
Providing information for national planning and policy.
Installing educational programs and products.
Operating information systems and services.
Conducting leadership development and training programs.

AAVIM
AMERICAN ASSOCIATION FOR VOCATIONAL INSTRUCTIONAL MATERIALS
The National Institute for Instructional Materials
120 Driftmier Engineering Center
Athens, Georgia 30602

The American Association for Vocational Instructional Materials (AAVIM) is a nonprofit national institute.

The institute is a cooperative effort of universities, colleges and divisions of vocational and technical education in the United States and Canada to provide for excellence in instructional materials.

Direction is given by a representative from each of the states, provinces and territories. AAVIM also works closely with teacher organizations, government agencies and industry.



Page 5

INTRODUCTION

Evaluation is an integral part of the teaching/learning process. It starts when you determine the training needs of your students (Module N-3) and develop instructional objectives designed to meet those training needs (Module N-4). This is the preassessment and planning phase of instruction.

In this module, the focus is on evaluating learners' progress in meeting the specified objectives. Student progress needs to be evaluated frequently so both you and your students know what they have accomplished and what still needs to be done. Student progress also needs to be evaluated frequently so students receive the positive feedback needed to stay motivated.

Through postassessment (end-of-course evaluation), you can verify that all objectives have (or have not) been met according to specified standards.

Preassessment gives you a starting reference point. You can measure "progress" because you know where your students were when they started. Performance objectives give you an ending reference point. You know whether students have accomplished what they set out to accomplish because their objectives have been clearly stated in observable terms.

In short, through an effective evaluation procedure, you can answer a number of essential instructional questions: What is being learned? How well is it being learned? How much is being learned? How effective are the learning experiences?

Such questions need to be answered regardless of the ages of the students involved. Adults tend to be utility-oriented learners who bring specific needs, objectives, and goals to the learning situation. It is therefore important not only that their needs be met, but that they have tangible proof of the dividends received from their investment of time and energy in the learning program. There are a variety of evaluation strategies that can help provide this proof.

The benefits of employing an effective evaluation procedure are at least fourfold: (1) you gain important information on learners' progress and can make appropriate adjustments to your instructional plans; (2) learners obtain information about their achievement in the program; (3) learners develop skills in evaluating their own work more objectively, thus gaining a valuable tool for future use; and (4) decision makers secure information they need to make decisions about whether to continue, restructure, or discontinue particular programs.

In this module, basic information on evaluation is provided in Learning Experience I. Information on testing (how to develop tests, the uses of test results, and grading) is addressed in Learning Experience II. Evaluation strategies that are particularly appropriate for use with adult learners are discussed in Learning Experience III.

There is a logical reason for this particular organization, for not discussing evaluation and testing in terms of adult learners throughout the module. New instructors need basic information about evaluation and testing, as well as information on how to evaluate adult learners. Experienced instructors of younger students, who already have skill in evaluating student performance, need only the adult-specific information. Thus, the module is organized to accommodate, as efficiently as possible, the needs of both groups: to allow each to develop the specific skills they need without having to complete activities covering skills they already possess.


Page 6


ABOUT THIS MODULE

Objectives

Terminal Objective: In an actual teaching situation, evaluate the performance of adults. Your performance will be assessed by your resource person, using the Teacher Performance Assessment Form (Learning Experience IV).

Enabling Objectives:
1. After completing the required reading, demonstrate knowledge of the rationale for and major factors involved in student evaluation (Learning Experience I).
2. After completing the required reading, critique a test (Learning Experience II).
3. After completing the required reading, critique the performance of an instructor in a given case study in developing evaluation plans for use with adult learners (Learning Experience III).

Prerequisites

To complete this module, you must have knowledge of the characteristics of adult learners and the process of adult development. If you do not already meet this requirement, meet with your resource person to determine what method you will use to do so. One option is to complete the information and practice activities in the following module:

Prepare to Work with Adult Learners, Module N-1

Resources

A list of the outside resources that supplement those contained within the module follows. Check with your resource person (1) to determine the availability and the location of these resources, (2) to locate additional references in your occupational specialty, and (3) to get assistance in setting up activities with peers or observations of skilled teachers, if necessary. Your resource person may also be contacted if you have any difficulty with directions or in assessing your progress at any time.

Learning Experience I

Optional

Reference: Morris, Lynn Lyons, and Fitz-Gibbon, Carol Taylor. How to Measure Achievement. Beverly Hills, CA: Sage Publications, 1978.
Reference: Popham, W. James. Modern Educational Measurement. Englewood Cliffs, NJ: Prentice-Hall, 1981.
Reference: Gronlund, Norman E. Measurement and Evaluation in Teaching. Fifth Edition. New York, NY: Macmillan Publishing Co., 1985.
Reference: Kirkpatrick, Donald L. How to Improve Performance Through Appraisal and Coaching. New York, NY: American Management Association, 1982.
References: The National Center for Research in Vocational Education. Professional Teacher Education Module Series; Modules D-1, D-5, and K-3. Athens, GA: American Association for Vocational Instructional Materials, 1984-86.

Learning Experience II

Optional

Reference: Denova, Charles C. Test Construction for Training Evaluation. Madison, WI: American Society for Training and Development, 1979.
Reference: Mager, Robert F., and Beach, Kenneth M., Jr. Developing Vocational Instruction. Belmont, CA: Pitman Learning, 1967.
Reference: The National Center for Research in Vocational Education. Professional Teacher Education Module Series; Modules D-2, D-3, D-4, K-4, and K-5. Athens, GA: American Association for Vocational Instructional Materials, 1983-86.

Learning Experience III

Optional

Reference: Knowles, Malcolm S. The Modern Practice of Adult Education: From Pedagogy to Andragogy. Revised Edition. Chicago, IL: Follett Publishing Co., 1980.

Reference: Knox, Alan B. Adult Development and Learning: A Handbook on Individual Growth and Competence in the Adult Years for Education and the Helping Professions. Higher Education Series. San Francisco, CA: Jossey-Bass, 1977.

Learning Experience IV

Required

An actual teaching situation in which you can evaluate the performance of adults.
A resource person to assess your competency in evaluating the performance of adults.

General Information

For information about the general organization of each performance-based teacher education (PBTE) module, general procedures for its use, and terminology that is common to all the modules, see About Using the National Center's PBTE Modules on the inside back cover. For more in-depth information on how to use the modules in teacher/trainer education programs, you may wish to refer to three related documents:

The Student Guide to Using Performance-Based Teacher Education Materials is designed to help orient preservice and inservice teachers and occupational trainers to PBTE in general and to the PBTE materials.

The Resource Person Guide to Using Performance-Based Teacher Education Materials can help prospective resource persons to guide and assist preservice and inservice teachers and occupational trainers in the development of professional teaching competencies through use of the PBTE modules. It also includes lists of all the module competencies, as well as a listing of the supplementary resources and the addresses where they can be obtained.

The Guide to the Implementation of Performance-Based Teacher Education is designed to help those who will administer the PBTE program. It contains answers to implementation questions, possible solutions to problems, and alternative courses of action.

Page 7

Learning Experience I

OVERVIEW

Enabling Objective: After completing the required reading, demonstrate knowledge of the rationale for and major factors involved in student evaluation.

You will be reading the information sheet, Student Evaluation. You may also wish to read one or more of the optional references: Morris and Fitz-Gibbon, How to Measure Achievement, and Kirkpatrick, How to Improve Performance Through Appraisal and Coaching.

You will be demonstrating knowledge of the rationale for and major factors involved in student evaluation by completing the Self-Check and comparing your completed responses with the Model Answers.

Page 8

Effective evaluation is an essential element in adult learning. It can provide both you and the learners with valuable information about learners' skills, achievements, and instructional needs. This knowledge is essential to your ability to develop instructional plans that will continue to meet students' individual needs. For information on why, when, and how to evaluate, how to grade, and how to use evaluation results, read the following information sheet.

STUDENT EVALUATION

Evaluation is a common and natural human activity. We all do it regularly, whether we are consciously using rigorous and clearly defined standards or taking a more casual and intuitive approach to making value judgments about daily situations.

Regardless of how we go about it, the process of evaluation acts as both a scale and a compass. It acts as a scale in that we measure the value of our gains and losses. It acts as a compass in the sense that we take a directional reading to determine where we stand in relation to the things that we value most.

If the evaluation process is accurate and we use it properly, we gain valuable information that can help guide our course of action. And if that course of action leads to meeting our objectives, it can renew our confidence in our ability to succeed and encourage us to continue our efforts.

If the process is inaccurate or inappropriately used, the negative effects are also felt. These effects are, at their best, demoralizing; and at their worst, loaded with damaging consequences.

Effective evaluation thus depends not only on developing accurate assessment procedures, but on acquiring the skills necessary for using them well. You need to devise good data-collection techniques, which will yield good data, and you need to be able to make sound judgments in analyzing and using those data for instructional purposes.


Why Evaluate?

Many adults come to a learning situation with specific needs that are critical to their professional or personal success. Because the time and energy that they can invest is often limited, they want tangible evidence that their investment is well placed. Information that provides learners with a view of their strengths, achievements, and the direction in which their work is heading gives them that evidence.

Learners need this evidence not only in formal learning situations, but in independent learning activities as well. Experience with a variety of assessment techniques can enable learners to develop effective and realistic approaches to assessment that they can use on their own.

The benefits of evaluation extend well beyond providing learners with this type of evidence. When students of any age obtain immediate, frequent, and positive feedback about their progress, they consistently learn more, retain more, and are more motivated to continue with their learning activities.

As an instructor, you also will need the information that can be obtained from assessment activities. It can help you identify learners who are ready to move on to new learning activities and those who need additional help in certain areas. It can also help you determine what changes you need to make in your instructional plans in order to meet individual needs.

Bear in mind that you, too, need the type of positive reinforcement that learners gain from knowing what they've achieved. This comes from seeing the evidence that students are acquiring needed skills and knowledge and from knowing that you are providing effective instruction to help make it happen.

A final reason to evaluate is that your own employer is likely to require that you evaluate and report on learners' progress at specified times. Education is expensive, and cost factors must be considered. Businesses and industries operating training programs are interested in the "bottom line": Is their investment paying off? Is the training program cost-effective? Are trainees learning the


Page 9


skills they need in an efficient manner? Schools and colleges, too, are operating more and more as businesses, with accountability concerns and a cost/benefit focus.

Student performance thus may have a great impact on the fate of a given program. If students in a particular program are not meeting the specific objectives, then the usefulness of that program is in question. Does the program need to be modified significantly? Should it be eliminated? Is the problem not in the program, but in the instruction? In short, program evaluation and teacher evaluation are directly related to student learning and success.

Evaluation Terminology

Because evaluators use some terms in very specialized ways, let's clarify our use of terminology before going any further.

Evaluation. Evaluation is a process by which you (1) gather information about the quality of the teaching/learning process and the achievement of objectives and (2) use that information to make educational decisions and improve the instructional process.

Assessment. Assessment is the use of specific techniques to gather that information. Thus, theoretically, you can assess (gather data) without evaluating, that is, without analyzing and making use of those data. When a student gets an F on a unit test and then goes on to the next unit without having mastered the previous unit, that's assessment without evaluation.

Ideally, however, assessment results should always be used not only to identify the amount of student progress and level of student achievement, but also to remedy any problems identified. It is by using assessment results that you can best serve those students presently enrolled in your program, as well as future students.

Testing. Testing is not a synonym for assessment; it is one type of assessment. The evaluation process can and should include a variety of assessment techniques: standardized tests and teacher-made tests; objective tests and subjective tests; tests of knowledge, skills, and attitudes; interviews and discussions; observations; and self-reports, just to mention a few. The assessment techniques can be formal (e.g., a structured test) or informal (e.g., a chat with a student who appears to be having trouble completing a lab assignment).

Performance. When training and development people use the term performance, they generally mean "behavior on the job." Thus, evaluation of performance would mean evaluation of on-the-job performance. This is not how we are using the term in this module.

When we say in the title of the module, Evaluate the Performance of Adults, we are using the term performance as it is used in everyday English. It means the evaluation of learners' accomplishments in the program. What do they know? What attitudes do they possess? What skills have they mastered? In performing skills, do the products they produce and/or the processes they employ meet established criteria?

Depending on the instructional setting, skills may be assessed in the lab, on the job, or both. You may assess just knowledge (through a paper-and-pencil test, for example). You may assess just attitudes (through an attitude scale, for example). Or you may assess knowledge, skills, and attitudes simultaneously using a performance test. In the process of performing a skill, the learner demonstrates that he or she knows the procedures to be followed, that he or she possesses such desirable attitudes as safety-consciousness, and that he or she can perform the skill and produce the product according to specified occupational standards.

Student evaluation vs. program evaluation. The focus of this module is on student evaluation: gathering information about student achievement so that you know, and they know, how much they have progressed since they entered the program and how far they have to go to meet the agreed-upon objectives. But, as mentioned previously, such information also helps you as an instructor to improve your program. Evaluation results show you where things are not working, which allows you to restructure your plans so they will work. In addition to making use of student evaluation data for program improvement purposes, most instructors also use specific program evaluation techniques . . . but that is not the concern of this module.1

When to Evaluate

Institutional requirements alone should not determine your schedule for evaluation activities. Your instructional plans and the learners' need for feedback throughout the course will help you decide when to evaluate. Generally speaking, there are three major evaluation phases: preassessment, assessment of progress, and postassessment.

1. To gain skill in program evaluation, you may wish to refer to Module A-11, Evaluate Your Vocational Program, and Module A-10, Conduct a Student Follow-Up Study. The use of follow-up surveys in adult programs is covered in Module N-2, Market the Adult Education Program.

Page 10

Preassessment involves evaluating learners' skills and knowledge in a specific area just before or at the beginning of a program (or course or learning activity). It enables you to verify student placement, identify individual training needs, and determine the appropriate level at which to present new learning experiences. The results of preassessment also provide a reference point for measuring the progress learners have made throughout the program.

In many educational institutions and training programs, preassessment is a regular part of the enrollment and admissions (intake) process. In addition, you as an instructor need to ask learners themselves about their skills, knowledge, and objectives in the occupational area. Learners are often the most direct source of information about themselves, and they can provide insights and information that may have slipped through the cracks of the intake process.

Another preassessment method is to test students on the specific content of the program. Theoretically, the results can be most useful. By testing students directly on the theory, skills, and terminology of the program up front, you get a very accurate sense of where they are. It can be a reality check for students, too. This type of preassessment is particularly useful in making pre/postassessment comparisons to identify precisely what and how much learning has occurred.

The disadvantage of this latter type of preassessment is that it can start students off with a negative experience. Presumably students are in the program because it is an area in which they lack skill or knowledge. That's a given. If you start by focusing on what they don't know, rather than on some success experiences, it is not particularly motivational. Thus, if you choose to use such tests, you need to administer them in a positive context, focusing on the usefulness of the results (1) in identifying where students are already strong and (2) in providing the information needed to plan relevant instructional experiences for the specific members of the class.

Assessment of student progress should be an ongoing process and should occur at regular times throughout the program. It is an essential part of effective learning. Not only do learners gain valuable knowledge about their achievements, but experience with assessment activities can help them build self-evaluation skills that they can use in years to come. As the instructor, you need the information gained from ongoing assessment in order to provide effective learning experiences.

Information from both formal and informal assessments can also help learners identify specific weaknesses they might not otherwise discover until the end of the program. At the end of the program, it might be too late to acquire important skills that they need to meet their objectives. Obtaining feedback during the program gives learners a chance to develop, practice, and build on these skills.

You should informally assess student work frequently and provide students with immediate feedback about their progress. More formal assessments (e.g., tests) should be made on a regular basis. As a general rule, the most valid and reliable results are provided by tests administered any time from several days to two weeks after skills and knowledge have first been learned. Your teaching situation (the length and type of the program, the instructional plans you have made, and the nature of the learners' activities) will help you determine the best times to conduct progress assessments.

Postassessment occurs at the end of a program or segment of a program. In a conventional time-based program, for example, exams may be given at the end of each unit and at the end of the term. Theoretically, the results of final exams sum up what students have learned since the beginning of the term (i.e., since preassessment). Postassessment results often play a large part in determining a student's final grade for the term. They also should be used as a basis for improving instruction the next time the course or program is offered.

In competency-based education (CBE) programs in which learners continue to work on each occupational competency until they have reached proficiency, postassessments generally occur more frequently. When a student feels he or she has mastered a competency, he or she performs that skill for the instructor, who rates the student's skill level using specific occupational criteria. If the student performs successfully, he or she moves on to the next competency. In such programs, the student may leave the program, not with a final grade (e.g., A-F), but with a competency profile showing the ratings he or she received on each individual competency mastered (see sample 1).

Another form of postassessment may be a part of your instructional program. In some occupational areas, such as cosmetology and nursing, students must pass a licensing exam before entering the occupation. As with other assessment efforts, you should make use of the results of licensing exams. The results of such exams give a very clear message to you and your students about the adequacy of your instruction and the adequacy of their efforts.


Page 11

SAMPLE 1

[Sample 1 is a competency profile form for a data processing program; most of the form is illegible in this copy. It lists occupational competencies with proficiency ratings and a Personal Qualities/Performance section in which the instructor checks O (Not Observed), S (Satisfactory), or E (Excellent) for items such as honesty, dependability, punctuality, pride in workmanship, and cooperative work habits.]

SOURCE: Arkansas Department of Education, Division of Vocational and Technical Education.

Page 14

Evaluation Strategies

There are many different evaluation strategies and, as an instructor, you will use a variety of strategies throughout the course. In simplified terms, these strategies fall into two basic categories: informal and formal.

Informal Evaluation

During the program, you will find that you are most often involved in informal evaluation activities. These can include (1) observing students as they learn and perform various tasks, (2) talking to individual learners about their plans and achievements, (3) helping learners find solutions to problems that they have, and (4) setting and maintaining an open-door policy that encourages learners to approach you for feedback. All these activities are crucial to the learning process.

Why are these activities essential to effective learning? Learners are not always able to accurately self-evaluate, and they need constant, positive feedback throughout the course. Lack of experience with the subject matter, lack of self-confidence, or the inability to always be objective about oneself all contribute to a learner's need for feedback from an instructor. In each step of the learning process, learners need to know that they have achieved or can achieve their objectives.

One of your major roles is to provide learners with positive feedback and helpful information about their work. Adult learners need immediate information that confirms that their work is "correct" and headed in the right direction. Praise for the sake of praise is not the purpose of positive feedback. It must be given for specific achievements; otherwise, it is generally meaningless and does little to help develop learning.

No matter how much learners have achieved, you should consistently point out and make positive comments on each learner's achievements. Whether they are learning quickly or slowly, all students need the encouragement and motivation they gain when you identify and respond positively to their achievements. Do not neglect "small" successes; each of these accomplishments builds toward the final objective and should be recognized.

Likewise, learners need your expert help in identifying their mistakes and finding ways in which to correct them. But remember, this is useful only if you provide the learner with information or help in solving the problem. Simply pointing out mistakes serves no good purpose and ultimately can discourage learners altogether. When learners are having problems, you need to ask them questions that will help them identify (1) the specific cause of the problem and (2) possible solutions. When you approach difficulties in this way, you not only help learners with the immediate situation, but you also help them learn how to solve problems independently.

Written comments on projects, papers, and tests are another valuable source of feedback. The same principles that govern effective oral feedback apply to written comments. Make sure that you respond to specific aspects of the learners' work, as well as to the overall product.

Formal Evaluation

Formal evaluation involves the use of evaluation instruments and grading systems. Using these instruments and systems is generally not extremely difficult, but you will need to understand some basic principles, steps, and guidelines in order to use them fairly and effectively. For this reason, the remainder of this information sheet and the one in Learning Experience II focus on developing effective evaluation instruments, designing good tests, and using the results effectively.

Two major approaches to testing are used in most adult occupational programs: written tests and performance tests. Paper-and-pencil or written tests consist of objective items, subjective items, or a combination of the two. Objective items are those in which the correct answer for a problem will always be the same, no matter who scores the test. Multiple-choice, matching, and true-false items are familiar examples. Subjective items (e.g., essay items and case studies) require the person scoring the test to interpret or make a judgment.


Page 15

Whereas written tests are used to evaluate knowledge, performance tests require the student to demonstrate skill in a particular task or set of tasks. The instructor observes the process or product (or both) of the learner's performance and rates the student's proficiency according to specified criteria.

When and how you administer tests will depend on a number of factors. Some of these factors, such as the principles and steps involved in developing a good test, will be fundamentally the same in each situation. Other factors will vary depending on where you are teaching and who your learners are. Let's look at the variable factors that will affect the evaluation strategies you use.

Institutional factors. Learners may need to gain specific knowledge, attitudes, and skills (competencies) in your program in order to obtain academic credit, a license, a certificate, or a degree. To demonstrate that they have met these requirements, learners may have to pass standardized tests developed locally or by a testing service. If so, you will need to identify the specific requirements so that you can develop learning experiences that will help learners meet those requirements and prepare for these tests. At the beginning of the program, you should also notify students of the expectations and requirements of the institution.

In other situations, the educational institution may not provide standardized tests but may require you to develop teacher-made tests based on explicitly stated instructional objectives. These tests are intended to provide proof to administrators, accreditation agencies, prospective employers, and students that learners have acquired specific skills and knowledge in your program. If you are teaching in an industrial training program, the firm that has hired you may also require this form of proof.

You will need to be aware of additional institutional policies regarding evaluation. For example, you may have to submit grades or other types of progress reports throughout the program. You should contact program administrators in order to answer the following questions:

What kind of evaluation reports must you submit, if any?
What policies govern the release of student evaluation data to others?
Are standardized forms provided?
What kind of grading system, if any, is used? Letter grades? Numerical ratings? Percentages? What levels of achievement do these grades represent?
What additional evaluation procedures does the institution require?
What concerns regarding evaluation seem of special importance to the institution (e.g., dropout rates, grade inflation)?

Psychological factors. There are a number of psychological factors that affect how well a person will perform during evaluation. Anxiety, intimidation, frustration, fear of failure, and physical discomfort can all negatively affect a learner's performance. When this happens, the evaluation results don't truly reflect a person's capabilities or knowledge.

Many adult learners who are returning for the first time to educational programs have negative memories and feelings associated with the idea of quizzes, tests, and examinations. For a variety of reasons, they may not have performed well on tests in the past. Right or wrong, they may still feel that when they take tests, they are victims of an arbitrary system which, when the results are in, seals their fate.

As a consequence, the way in which you present evaluation activities is an important part of the evaluation process. In your role as an instructor, you can help generate an atmosphere that says that learners are not helpless victims. By providing positive and informative feedback throughout the course, you can do much to help prepare learners for tests, both psychologically and in the subject area.

You should emphasize that tests and other evaluations are not designed to be a vehicle for pigeonholing students or their learning experiences as "successes" or "failures." Rather, tests provide a means for (1) determining where students are in relation to meeting their objectives and (2) identifying areas that need to be strengthened in order for them to meet those objectives. Tests are learning devices. You should also let learners know that, in addition to measuring their abilities, test results reflect the quality of the learning experiences that you have designed.

You can do much to help learners become more comfortable with evaluation if you openly acknowledge their possible anxiety. For example, discussing such thoughts as the following may be helpful:

That they may be experiencing some anxiety
That some levels of anxiety and apprehension are common and normal
That they are not alone in these feelings
That anxiety about doing poorly does not necessarily foreshadow poor results

Many instructors have found that group discussions about testing and evaluation in general can help relieve undue anxiety and dispel self-doubts that learners may have.


Page 16

In addition, when learners have their first test coming up, you can offer tips on how best to study for and complete an exam, including the following:

How to space out studying, rather than cramming
How to determine which topics to focus on
How to approach the test (e.g., importance of reading and following directions, writing legibly, reviewing responses for accuracy)
How to determine how much time to spend on each test item (e.g., knowing to skip an item if they do not know the answer immediately and then to return to it after they have finished the rest of the test)
How to use the test results as a learning tool

Learners' skill and knowledge levels. How well developed are your learners' basic communication skills? This is an important consideration. A technical knowledge test that requires extensive use of these skills will not yield valid results if learners' basic skills are not well developed. Students may know the information being tested very well, but if they are unable to understand and respond to the test items as presented, their test results will be poor.

Students with poor listening and speaking skills may have difficulty demonstrating the true extent of their technical knowledge on an oral test. Students with poor reading and writing skills may have difficulty demonstrating the true extent of their technical knowledge on a written test. You will need to make sure, therefore, that the test items you use correspond to learners' abilities to understand and respond.

This is not to say that the development and evaluation of students' communication skills may not be a crucial part of your instructional program. It simply means that when you want to test technical knowledge, you should ensure that you do so, that is, that you do not allow other factors, such as weak communication skills, to cloud your results.

Special needs. Physical impairments may also cause poor test results that don't truly reflect a learner's knowledge or abilities. For example, a physically handicapped person might need better wheelchair access to machinery or work areas in order to perform some tasks properly. A visually impaired student might have difficulty reading test items in standard-size print. In such cases, you would need to modify the evaluation process in order to evaluate students fairly.2

2. To gain more skill in evaluating the performance of learners who have special needs, you may wish to refer to Module L-9, Assess the Progress of Exceptional Students.

Individual needs and objectives. Ideally, the primary function of evaluation should be to measure how well individuals are meeting their own learning objectives and gaining skills and knowledge that will help them meet their own overall goals. This ideal situation does not often exist, however. Other factors (e.g., institutionally established learning objectives, professional requirements, industrial standards) affect the functions evaluation must serve. Even so, adult learners need feedback that directly relates to their individual needs and goals. You should take this into account when developing both formal and informal evaluation activities.

Learning domains. For practical purposes, learning is often categorized into three primary areas, or domains: knowledge (cognitive), skills (psychomotor), and attitudes (affective). Understanding the nature of these domains and their relationship to different types of learning activities can help you determine exactly what type of learning you will evaluate and the most effective way to do it. Of course, learner performance in these domains will overlap, but it is useful to consider each of the three domains when developing evaluation plans.

Imagine, for instance, a program on the repair and maintenance of automobile brake systems. Before students actually perform routine maintenance tasks on vehicles, the instructor would probably want to test their knowledge of brake systems. A written or oral test could be used to test students' knowledge and determine whether they are ready to move on to the hands-on learning experience.


Page 17


At certain points in the program, the instructor would need to test learners' skills in various tasks. A performance test that required learners to actually change brake shoes, for example, would be an appropriate method of evaluation for that type of task.

Learners' attitudes are also an important factor in the maintenance of automobile brake systems. For example, students need to value and act on a number of safety precautions related to working on automobiles. They also need to value a properly maintained brake system. Thus, the instructor would need to evaluate students' knowledge of these safety factors, as well as their performance of safety measures in actual practice.

Student performance objectives. All the factors discussed so far can play an important role in determining the kinds of assessment techniques you use. But it is student performance objectives that actually form the basis for evaluation. These objectives define the specific knowledge, skills, and attitudes that learners need to acquire in your program, and they include the criteria by which student performance should be measured.

For example, an objective might be as follows:

Given ten shafts with differing measurements, measure the diameter of each with a micrometer within ± .001" of the instructor's measurement.

The objective includes a condition statement (given ten shafts with differing measurements), an action statement (measure the diameter of each with a micrometer), and a criterion statement (within ± .001" of the instructor's measurement).
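To make the criterion statement concrete, here is a minimal sketch (in Python, added for illustration and not part of the original module) of how a learner's readings might be checked against this objective; the shaft measurements shown are hypothetical:

    TOLERANCE = 0.001  # inches, from the criterion statement

    # Hypothetical reference (instructor) and learner measurements for the ten shafts, in inches.
    instructor_readings = [0.500, 0.625, 0.750, 0.875, 1.000,
                           1.125, 1.250, 1.375, 1.500, 1.625]
    learner_readings = [0.5005, 0.625, 0.7495, 0.875, 1.002,
                        1.125, 1.250, 1.375, 1.4995, 1.625]

    # The objective is met only if every reading falls within the stated tolerance.
    deviations = [abs(l - i) for l, i in zip(learner_readings, instructor_readings)]
    criterion_met = all(d <= TOLERANCE for d in deviations)
    print(criterion_met)  # False: the fifth shaft is off by 0.002"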


The criterion statement describes the standard to be met, and in occupational training programs, occupational standards generally prevail. Students need to learn to perform according to the standards they will have to meet in the job for which they are training.

Where your objectives come from will be determined by your particular instructional situation. The school, college, or firm may provide you with a very specific list of objectives that must be met in your program. Or you may need to specify program objectives yourself, based on occupational requirements and students' training needs. Or students' individual educational and training needs may play a very large role in determining program objectives, as may be the case in a continuing education program, for example.

When the institution specifies objectives, the institution should also specify the criteria to use in measuring whether the objectives have been met. When the objectives are derived from occupational requirements, occupational standards should form the criteria to be met. When the instructional situation is such that students help specify some or all of their own objectives, then they must also help specify the criteria to be met.

For example, an individual may enroll in a computer literacy course for professional development reasons, even though computer use is not a requirement of his/her job. In that case, he or she needs to help determine the criteria that will define "success," based on his/her own personal objectives.

Grades and Grading

Depending on your situation, you may be required to submit only records of final grades to program administrators, or you may have to submit more complete records (e.g., records of progress reports and grades received throughout the term, copies of tests and performance checklists for each student). You will need to consult with administrators or supervisors where you are teaching to determine what procedures and what type of grading system (if any) are required.

At the beginning of the program, you should inform learners about the grading procedures and standards that will be used. They need to know what to expect and how it will affect their overall records.

Grading Systems

A number of grading systems are in fairly common use. The most common is a letter grade (A, B, C, D, F) or numerically equivalent grade point (4, 3, 2, 1, 0) system. Both of these systems are based on a five-point scale, in which each letter or number is equal to a verbal quality rating. For example:

A (or 4) = Excellent work, clearly superior performance
B (or 3) = Very good work, exceeds minimum standards of performance
C (or 2) = Work meets standards, acceptable performance
D (or 1) = Below standard work; barely acceptable performance
F (or 0) = Work not acceptable


Some institutions also provide an I (incomplete) grade category. If learners receive an I, they must complete their unfinished work in a set amount of time before a final grade is given.

Another common grading system uses percentages. The following is one example of percentage ranges and equivalent quality ratings:

90%-100% = Excellent
80%-89% = Very good
70%-79% = Acceptable
60%-69% = Barely acceptable
Below 60% = Not acceptable

Percentages allow you to calculate grades more precisely because they include a wide range of increments; you can report not just an A, but a low A (e.g., 90%) or a high A (e.g., 100%). Many instructors prefer to assign percentage grades throughout a course so that when all grades are averaged, the final grade will more precisely reflect the student's actual level of performance. Percentage grades can then be converted to letter grades or numerical grade points, if required by the institution.

Note that the relationship between letter grades and percentages will vary among institutions. For instance, in some schools, 92-100 percent is equal to an A, and 84-91 percent is equal to a B; in others, an A is equal to 90-100 percent, and a B equals 80-89 percent. No matter what values you assign to the grades you award during your program, the grades that you submit to administrators should correspond to the standards and values they have established.
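Because the letter-grade cut-offs differ from one institution to another, it can help to treat them as data rather than as fixed rules. The Python sketch below illustrates the idea; the two scales are simply the examples from the preceding paragraph, with the C and D cut-offs filled in as assumptions.

def percentage_to_letter(score, cutoffs):
    # `cutoffs` maps each letter to the lowest percentage that earns it.
    for letter, minimum in sorted(cutoffs.items(), key=lambda kv: kv[1], reverse=True):
        if score >= minimum:
            return letter
    return "F"

# Two institutional scales like those described above (C and D values are assumed).
scale_one = {"A": 92, "B": 84, "C": 76, "D": 68}
scale_two = {"A": 90, "B": 80, "C": 70, "D": 60}

print(percentage_to_letter(91, scale_one))  # B under the first scale
print(percentage_to_letter(91, scale_two))  # A under the second scale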

Some instructors and educational institutions use an S/U grading system. In order to receive an S (satisfactory) grade, a learner's performance must meet or exceed specified minimum standards. A U (unsatisfactory) grade reflects work that does not meet the minimum standards. Pass/fail grades are similar in principle to S/U grades.

There is some controversy over the value of S/U and pass/fail systems. Some educators believe that these systems are unfair and unrealistic because they cannot truly reflect, except at the most fundamental level, the quality of a learner's work. Others argue that these systems are superior to other systems because they deliver the essential information; that is, the learner either possesses or does not possess certain skills and knowledge.


Other institutions report progress and achievement, not using grades, but using written evaluations, skill reports, performance checklists, or competency ratings (as in sample 1, pp. 9-11). Brevity and clarity are essential to developing fair and accurate reports.

Some instructors, especially in CBE programs, use grade contracts. They have learners contract for specific grades in each learning experience and/or for the complete program. This method is particularly appropriate for use with adult learners. Adults will generally appreciate, respect, and commit themselves to a grade contract. This approach recognizes their autonomy and ability to choose the appropriate level and amount of work for themselves. Sample 2 provides additional detail about contract grading. Sample 3 is an example of a variable contract for determining grades.

If you are teaching in a situation where grades are not an issue, you may simply plan regular one-to-one conferences and group discussions to provide learners with insights and constructive feedback about their work.

Grade Conversion

If you must submit final grades to the institution, you will need to convert all grades earned during the program into a single grade. This grade should reflect each learner's total achievement. In order to convert grades, you must first establish the relative value, or importance, of each test or graded learning activity.


If you figure that the total maximum performance level for your program is 100 percent, each test or graded learning activity then constitutes a percentage of the total possible grade. For example, grades in one program might be weighted as follows:

60% - Six in-class projects @ 10% each
20% - Two written papers @ 10% each
10% - Two take-home assignments @ 5% each
5% - Mid-term examination
5% - Final examination

100% = Total possible grade

Of course, in every learning situation, the relative value of assignments will vary. The most important concern is that the items be weighted in a way that reflects the true value of each individual activity in relation to the whole learning experience. If you are teaching in a situation in which major learning objectives and projects can be individualized, learners may have some options about how grades are weighted. Regardless of how the relative value of learning experiences is established, however, learners need to be made aware of these values at the beginning of the program.

Using the established values, you can then convert each learner's scores to a final grade. Sample 4 uses the preceding weighted grade values to demonstrate how an instructor converted one learner's grades to a final grade. The column on the left shows the learner's grades. The column on the right shows the instructor's conversion calculations (with totals rounded off to the nearest tenth). The subtotals on the right are added to determine the final grade for the course. If necessary, the final percentage grade could be converted to a letter grade based on established letter grade/percentage equivalents.
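The arithmetic behind such a conversion is simply a weighted average: each percentage grade is multiplied by the weight of its activity, and the products are added. The Python sketch below uses the hypothetical weighting shown above; the individual scores are invented for illustration.

# (activity, weight, learner's percentage grade) -- the scores are invented examples.
graded_work = [
    ("Six in-class projects (10% each)",    0.60, 92),
    ("Two written papers (10% each)",       0.20, 88),
    ("Two take-home assignments (5% each)", 0.10, 95),
    ("Mid-term examination",                0.05, 85),
    ("Final examination",                   0.05, 89),
]

for activity, weight, score in graded_work:
    print(f"{activity}: {score} x {weight} = {round(score * weight, 1)}%")

final_grade = sum(weight * score for _, weight, score in graded_work)
print(f"Final grade for the course: {round(final_grade, 1)}%")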

Grade Records

It is vital that you keep accurate and up-to-date records of all grades for each learner. These records serve several purposes. You will refer to them for information about learner progress and achievement, information you need to assist individual learners and plan relevant instruction for the class. Your records provide the documentation you need to calculate final grades for reporting purposes and to justify decisions made concerning the awarding of credits, licenses, or degrees. These records can also be useful when students or former students ask you to write letters of recommendation for them.

The actual system that you choose for recording and storing grades is, to some extent, a matter of personal preference. The "best" method will be one that works for you. Many educational institutions provide instructors with record books designed expressly for keeping grades. Some instructors prefer to use other materials, such as index-card files or loose-leaf binders, in designing a record-keeping system for their grades.

Regardless of the record-keeping system that you choose, it is vital that you keep these records in a safe place to ensure learners' privacy.


SAMPLE 2

(Reproduction of an information sheet on contract grading for adult students; the text is not legible in this copy.)

SOURCE: Excerpted from Module D-5, Determine Student Grades, p. 12


SAMPLE 3

(Reproduction of a variable contract for determining grades; the content is not legible in this copy.)

SOURCE: Richard Gustafson, Montgomery County Joint Vocational School, Clayton, Ohio


SAMPLE 4

Instructor's Grade Calculations

(The reproduced table is largely illegible in this copy; the legible entries include "89 x .05 = 4.5%" and "LEARNER'S FINAL GRADE FOR COURSE = 90.0%.")


Using Evaluation Results

No evaluation process is really useful unless the results of assessment are reviewed, analyzed, and incorporated into plans for future learning. Consequently, it is important that you do more than simply mark and record learners' grades, return the graded tests and assignments to individuals, and move on to new learning experiences as originally planned.

Both you and the learners need to review assessment results and identify the following:

Individual and group achievements
New learning experiences for which learners are now prepared
Ways that newly acquired skills and knowledge can be used in upcoming learning experiences
Skills or knowledge areas that need to be strengthened
Resources (including the learners' strengths), techniques, and activities that learners can use to reach competence in their weaker areas
Changes that need to be made in the instructional plans in order to help learners meet their objectives

It is important to remember that positive evaluation results don't just point to learners' successes. They also indicate your success in enabling others to meet their objectives. You should take time to recognize when your own work is well done, when you have provided effective instruction and an environment that is conducive to learning.

Remember, too, that the causes of poor evaluation results (i.e., low grades) need to be identified before they can be addressed. These results can indicate a variety of things. For example:


Learners' nonreadiness for evaluation. People learn at different rates; those who do poorly one week may be prepared by the next. All tests do not have to be given at the same time. The objective of a program should be learning, not test taking at a fixed time.

Misassessment of learners' training needs. When learners' training needs and skills are not identified (or not identified accurately) during preassessment, the result may be poor performance on evaluations. For example, some students may need remedial instruction in certain areas. It is important to discover this as early as possible, so that individuals can gain the skills they need to participate in the learning experiences.

Weaknesses in the instructional plan. Perhaps learning activities began at too high or too low a level, given what learners already knew and could do. Instruction may need to be modified or paced differently. Bear in mind that when activities are paced too slowly, some students will become bored and inattentive. Thus, they may do as poorly on tests as learners who find that learning experiences are moving too quickly.

Unrelated problems. Sometimes problems not related to instruction affect evaluation. For example, illness of a relative, unemployment, marital difficulties, and a myriad of other factors (including an instructor's negative attitude toward the course or lack of genuine concern for the students) can affect learners' performance.

Optional Activity

For more information on student evaluation and testing in general, you may wish to read one or more of the following supplementary references: Morris and Fitz-Gibbon, How to Measure Achievement; Popham, Modern Educational Measurement; Gronlund, Measurement and Evaluation in Teaching; and/or Kirkpatrick, How to Improve Performance Through Appraisal and Coaching.

Optional Activity

To gain additional skill in conducting student evaluation and grading, you may wish to refer to one or more of the following PBTE modules:

Module D-1: Establish Student Performance Criteria
Module D-5: Determine Student Grades
Module K-3: Organize Your Class and Lab to Install CBE

The following items check your comprehension of the material in the information sheet, Student Evaluation, pp. 6-21. Each of the three items requires a short essay-type response. Please explain fully, but briefly, and make sure you respond to all parts of each item.

SELF-CHECK

1. During a discussion about teaching adults, one instructor flatly states, "Adult learners should never be evaluated, and if you must assign grades because the institution requires it, you should just give one easy test at the end of the course." What would be your response to this instructor?


2. The following seven factors can affect the evaluation process. Read each factor, and describe how it may affect the evaluation strategies you use.

Institutional factors

Psychological factors

Learners' skills and knowledge levels

Special needs

Individual needs and objectives

Learning domains

Student performance objectives


3. What is the best type of grading system to use?


Compare your written responses to the self-check items with the model answers given below. Your responses need not exactly duplicate the model responses; however, you should have covered the same major points.

MODEL ANSWERS

1. You might well disagree with the instructor, citing the following as reasons in favor of evaluating the performance of adults.

Typically, adults are utility-oriented learners who bring specific needs and objectives to a learning situation. Their needs may be immediate, and given other life responsibilities, the amount of time and energy that they can commit to a learning experience may be limited.

As a result, adults want and need proof that their needs are indeed being met, whether they are in formal or independent learning situations. Effective evaluation activities not only provide learners with this information, but also help adults learn to assess their own performance more accurately. Tangible proof of progress provides a vital boost to morale; thus, it reinforces and encourages learning.

As the instructor, you also will need the information provided through evaluation. By incorporating assessment activities into all phases of the program, you can take into account learners' strengths, weaknesses, and objectives and design instructional plans that will enable learners to meet their needs.

Furthermore, one easy test at the end of the course would not provide you, the learners, the institution, or licensing and certifying agencies with valid information about learners' real abilities and achievements. Effective evaluation occurs during all phases of the program: at the beginning of the program (preassessment), throughout the program (assessment of progress), and at the end of the program (postassessment).

2. Institutional factors. Learners may be required to gain specific skills, knowledge, and attitudes in order to receive a degree, academic credit, a license, or a certificate. They may have to take standardized tests based on criteria specified by an educational institution, business/industry, or certifying or licensing agency. Learners need to be prepared for this, and your assessment activities must take these criteria into account.

The way in which grades are to be reported and other types of evaluation requirements may also be part of institutional policies. In some cases, report forms are provided for instructors, and a schedule for submitting evaluation reports is fixed. There might be an established grading system that reflects specific levels of achievement. Your grading system will need to be compatible with the institution's evaluation policies and procedures.

Psychological factors. Some adult learners have negative memories of testing. For any number of reasons, they may not have performed well and, hence, may be anxious about being tested. Even those who have performed well on tests in the past may still be anxious in a testing situation. If the fear is great enough, it can negatively affect a learner's performance and yield test results that do not reflect the learner's true abilities.

If anxiety is a factor, you need to emphasize the ways in which assessment activities can help learners. Through identifying their strengths and weaknesses, learners can determine where they are in relation to their objectives and strengthen skills as needed so as to meet those objectives.

You should also openly acknowledge test anxiety and how common it is. Your reassurance can go a long way in helping learners (1) feel more comfortable with testing and (2) discover positive techniques for dealing with evaluation.

In addition, you can help learners cope better with testing by providing them with tips and strategies for test-taking.


Learners' skill and knowledge levels. Learners' skill and knowledge levels affect how you develop a test and write test items. For example, a test becomes less valid if it requires students to use communication skills they don't have. When this happens, learners cannot demonstrate their knowledge because they cannot communicate it to you. Thus, it is essential that you identify learners' skill and knowledge levels and develop tests appropriate for those levels.

Special needs. Learners may have special needs that require modifications in the testing process. For example, a visually impaired student might have difficulty reading a written test. A physically impaired student might have problems in fitting written responses in the spaces provided or in operating equipment that is not modified for his or her use. Any such needs must be provided for in the evaluation and testing situation.

Individual needs and objectives. Adult learners bring their own individual needs and objectives to a program. These are often as important as your overall program objectives and should be identified and included, in individualized form, in assessment activities.

Learning domains. Learning is often categorized into three basic areas: knowledge (cognitive), skills (psychomotor), and attitudes (affective). In evaluating student performance, you need to ensure that you evaluate learning in each area, as appropriate.

For instance, knowledge may be assessed via written tests. Skills may be assessed via performance tests. And attitudes (e.g., regard for safety, punctuality) may be assessed via written tests and classroom observations. Learning in all three areas may be measured, directly and indirectly, via a performance test.

In each situation, you need to identify (1) the type of learning covered by the objective(s) and (2) the most logical and efficient way to assess learners' achievement in that domain or those domains.

Student performance objectives. Just as student performance objectives define the knowledge, skills, and attitudes to be learned, they also establish a basis for evaluation. As you identify objectives to be covered in the program, you should also identify the specific criteria by which learners' achievement should be assessed. The source of the objective (the institution, the occupation, or the learner) should also be the source of the criteria.

3. There is no one best grading system that you should use in all situations. Your particular situation (the requirements of the institution in which you are teaching, course objectives, and learners' objectives) will determine the type of grading system you should use.

Level of Performance: Your written responses to the self-check items should have covered the same major points as the model answers. If you missed some points or have questions about any additional points you made, review the information sheet, Student Evaluation, pp. 6-21, or check with your resource person if necessary.


Activity


Effective evaluation includes the use of effective tests, and in most cases, you will need to develop your own. For this you will need some knowledge of the principles of test development. For a general introduction to the different types of tests, their uses, and how to develop them, read the following information sheet.

HOW TO CONSTRUCT A TEST

Knowledge about who, what, and where you are teaching is not enough information on which to develop a test. In order to develop an effective evaluation system, you will also need to know (1) the characteristics and uses of good written and performance tests and (2) the guidelines for constructing different types of test items. Let's look first at written tests.

Written Tests

Written tests may be objective or subjective; they must be valid, reliable, and usable. Sample 5 provides some basic information on these three principles of test development.

Objective vs. Subjective

As noted in the sample, whether you use objective or subjective test items in a given situation can affect the reliability of your test, its ability to consistently measure what it is supposed to measure. Each type of item has distinguishing features, advantages, and disadvantages.

Objective items. These are items in which the correct answer will always be the same, no matter who is giving or scoring the test. The following are some major advantages in using objective test items:

Objective test items can usually be answered fairly quickly. Therefore, you can test for a fairly wide range of information in one test.
These items can be scored relatively quickly, easily, and accurately using an answer key.
Judgment calls and opinions are neither necessary, possible, nor desirable in scoring these test items; hence, their reliability is high.

The disadvantages associated with objective test items include the following:

Objective test items can be time-consuming to develop and write.
They are limited in the types and levels of learning that can be evaluated. They test mainly the learners' ability to identify or recall information, not their ability to analyze, synthesize, and organize their knowledge.


Even professionally developed test items may be ambiguous, and learners are often given no opportunity to justify or qualify their answers.
Even the best-written objective items can sometimes allow learners to guess the correct answer.

Subjective items. These items require the scorer to judge and interpret individual responses to each item. In order to ensure that a subjective test is as valid and reliable as possible, you will need to establish an objective answer key that includes all the major points that learners should have covered in their responses.

The following are some distinct advantages to the appropriate use of subjective test items:

They can test learners' ability to analyze, synthesize, and organize their knowledge; and they can cover a broad or narrow range of information.
Subjective test items provide learners with a means for creative problem solving. There are many cases in which there may be no single correct or workable solution to a problem. A subjective item may be most appropriate in these situations.

There are two primary disadvantages in the use of subjective test items:

More time is needed for reading and scoring these items than for scoring objective items.
The validity and reliability of a subjective-item test may be lower than that of an objective-item test. Since the value of an answer depends in part on how well it is defended and supported, it may be hard to judge its value in absolute terms.


SAMPLE 5

(Reproduction of an information sheet on test validity, reliability, and usability; the text is not legible in this copy.)

SOURCE: Excerpted from Module D-2, Assess Student Performance: Knowledge, pp. 8-10


Experienced teachers have noted that the objective test is usually broad, but thin, including wide ranges of subject matter and many items, but never at any particular depth. The reverse is true for the subjective test; it is usually narrow, but deep. Hence, many teachers consistently provide tests that include both types of items, not infrequently in something approximating a 50-50 mix.

Planning the Test

Test design, including both the format and the way in which the instructions and items are written, can also affect the test's validity, reliability, and usability. Sample 6 outlines important procedures for developing and administering a good written test. As noted in the sample, careful planning is a key element in developing a good test. The following are a few simple steps for planning a test:

Determine when you will need to administer the test.
Identify the specific student performance objectives to be covered by the test.
Identify specific criteria, or standards, by which learners' achievements will be measured.
Determine the type(s) and number(s) of test items that are most appropriate for evaluating each kind of performance (knowledge, skills, attitudes).
Develop the test items.
Determine the relative value of the objectives and the criteria, and establish the point value of test items accordingly.
Plan how you will present the test to the learners; include such information as why you are giving a test, the skills and knowledge areas that will be covered, whether the results will be part of a final grade, and if so, what percentage of the total grade they will constitute.
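For longer tests, some instructors find it helpful to rough out a simple blueprint of this plan before writing any items. The Python sketch below is a hypothetical illustration; the objectives, item types, and point values are invented, and the final check simply confirms that the points add up to the intended total.

# Hypothetical test blueprint: objective -> (item type, number of items, points per item).
blueprint = {
    "Identify the parts of a micrometer":       ("multiple-choice", 10, 2),
    "Read micrometer scale settings":           ("completion",       5, 4),
    "Explain when a micrometer should be used": ("essay",            1, 10),
}

planned_total = 50
total_points = sum(count * points for _, count, points in blueprint.values())
assert total_points == planned_total, "Item point values should add up to the planned total"

for objective, (item_type, count, points) in blueprint.items():
    print(f"{objective}: {count} {item_type} item(s), {count * points} points")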


Developing Objective Test Items

Four major types of written test items are generally considered to be objective: multiple-choice, matching, true-false, and completion items.

Multiple-choice items. In multiple-choice items, the learner selects from a list of possible responses the one correct (or most nearly correct) answer to a problem or question. These items are especially useful for evaluating learners' knowledge, including their ability to recall, identify, and synthesize a wide range of information in a single test.

Multiple-choice items provide a simple response format for the learner and quick, accurate scoring for the instructor. Initially, you may find multiple-choice items somewhat time-consuming and difficult to write. But many teachers feel that the advantages in administering and scoring them more than compensate for this. You will also find that, with experience, good multiple-choice items become easier to write.

The problem or question stated at the beginning of the item is called the stem. The possible answers to the item listed are called responses. The correct response is simply referred to as the answer, and the incorrect responses are referred to as distracters.

The stem can be stated as a problem to be solved, a question to be answered, or an incomplete statement to be completed. By varying the stem, you can adapt the multiple-choice format for your particular use. Sample 7 shows examples of multiple-choice items with various stem formats.
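Using the terms just defined, a multiple-choice item can be thought of as a stem, one answer, and several plausible distracters. The Python sketch below shows that structure and shuffles the responses so that the answer does not always occupy the same position; the item itself is invented for illustration.

import random

# A hypothetical multiple-choice item: stem, answer, and distracters.
item = {
    "stem": "Which instrument is used to measure a shaft's diameter to .001 inch?",
    "answer": "Micrometer",
    "distracters": ["Tape measure", "Protractor", "Carpenter's square"],
}

# Shuffling guards against a fixed response pattern.
responses = [item["answer"]] + item["distracters"]
random.shuffle(responses)

print(item["stem"])
for letter, response in zip("ABCD", responses):
    print(f"  {letter}. {response}")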

Regardless of the stem format you choose, stems must be clearly and unambiguously stated. Students cannot be expected to provide a correct answer if they cannot clearly understand the question or the problem.

When devising responses, remember that not only must all distracters be completely incorrect, but they must also be plausible. If distracters are partially correct, the test is not a valid or reliable measure of the learners' knowledge. Likewise, distracters that are not plausible are a dead giveaway to the correct answer. This also results in a test that is not valid.

It is equally important that the answer be completely correct. If students know the information, or how to solve the problem, there should be no doubt about the correctness of the answer.


SAMPLE 6

(Reproduction of a list of procedures for developing and administering a written test; the content is not legible in this copy.)

SOURCE: Excerpted from Module D-2, Assess Student Performance: Knowledge, pp. 10-15

The way the responses are written should not provide clues to the answer. All responses should be phrased in such a way that, when linked with the stem, they result in complete, comprehensible, grammatically correct statements. All responses should be of approximately the same length. Inexperienced test writers often write longer correct answers, with more qualifiers, in order to ensure that they are completely true. The noticeable difference in length is a clue to the correct answer.

A response pattern is another clue to the correct answers. When the correct answers always appear in the same position in the response list (e.g., the answer is always in position B) or if they run in a predictable pattern (e.g., B-E-C, B-E-C, B-E-C), the result is a response pattern. Students generally become alerted to this and find that they can rely on the pattern, rather than on their knowledge, to supply the correct answer.

The optimum number of responses for a single item is usually four or five. With this number, the chances of guessing the correct answer are lower than if there were fewer responses. More than this number of responses can be confusing or take too much time to review. This can cause the learner to lose sight of the point of the question.

SAMPLE 7

(Reproduction of sample multiple-choice items with various stem formats; the text is largely illegible in this copy.)

Matching items. Matching items require learners to identify significant relationships, between words or concepts, between principles and their applications, or between symbols and objects, for example. These items can be fairly easily constructed. Because they require learners to do little writing, matching items can be administered and scored relatively quickly.

Matching items are generally composed of two lists. One list (usually the one on the left) is made up of premises, while the other contains responses. A relationship is stated between the two groups, and the learner is asked to match each premise with the correct response. Sample 8 shows a portion of a matching item, including directions.

When developing matching items, you should provide approximately 10 to 15 elements in each list. (Dealing with more items tends to be confusing for the learner, and providing fewer items encourages the learner to guess.) In addition, you should always provide more responses than premises in order to help reduce the possibility of guessing the correct answers by process of elimination. One list, usually the premises, should be sequentially numbered, and the other, alphabetically lettered. This helps to avoid confusion in both answering and scoring.

Because students need to read the list of responses more than once in order to identify the correct response for each premise, the responses themselves should be brief. The premise statements need not be short, but they should be clear and unambiguously stated.

When producing (e.g., typing) the test, be sure that the whole matching item appears on a single page. It is time-consuming and unnecessarily confusing for learners if they have to look back and forth between two pages to view all possible responses to each premise.

As with other types of test items, you will need to safeguard against response patterns that can help learners predict the correct responses.
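The layout conventions just described (numbered premises, lettered responses, and more responses than premises) are easy to check when an item is drafted. The Python sketch below formats a small, hypothetical matching item along those lines; the tools and descriptions are invented, and a full item would normally contain closer to 10-15 premises.

# Hypothetical matching item: premises on the left, responses on the right.
premises = [
    "Measures a shaft's diameter to .001 inch",
    "Checks whether a surface is level",
    "Measures the inside diameter of a pipe",
]
responses = ["Micrometer", "Spirit level", "Inside calipers", "Protractor", "Tape measure"]

# More responses than premises reduces guessing by elimination.
assert len(responses) > len(premises)

print("Match each description with the correct tool.")
for number, premise in enumerate(premises, start=1):
    print(f"{number}. {premise}")
for letter, response in zip("ABCDEFGH", responses):
    print(f"   {letter}. {response}")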


SAMPLE 8

(Reproduction of a sample matching item and its directions; the text is not legible in this copy.)

True-false items. On true-false items, learners must identify whether each given statement is true or false. You can use these items to assess a wide range of factual knowledge, as well as learners' ability to interpret and summarize this knowledge.

True-false items do not require learners to do extensive writing and therefore can be administered and scored fairly quickly. A problem with true-false items, however, is that they are often less reliable because students have a 50-50 chance of guessing the right answer.

Statements for true-false items are presented in list form. Students are usually asked to mark their responses to the left of each statement. Responses can be marked in various ways (e.g., by writing true or false or by circling T or F), as shown in sample 9.

True-false statements should be specific, brief, and clear. Learners must know precisely what the statement is saying before they can determine whether it is true or false. Each item must also be either completely true or completely false. If items are partly true and partly false, learners have to guess whether they are to respond to the true or the false portion of the statement, and the test is no longer valid.

The validity of true-false items will also be reduced if the wording provides clues or if learners have to guess at your definitions. For example, absolute words, such as always, never, completely, all, more, and totally, tend to be associated with false statements and thus provide clues to the answer. Ambiguous words, such as usually, frequently, some, often, and several, require learners to guess at your personal definition of these words. How often is frequently? How much is some? Be specific when citing measurements of time, volume, or other quantities.

Some instructors give in to the temptation to lift direct quotes from textbooks or lecture notes when writing true-false statements. Although this may simplify the instructor's task, it does not produce a valid means of assessing learners' understanding or knowledge. Rather, it tests learners' ability to memorize what they have read or heard. You can develop good true-false items by focusing on the essential information covered in your course.

True-false items can be developed in ways that are more thought-provoking and less a guessing game. In contrast to what was said earlier, items can be developed so that some are partially true and partially false, if the directions specify that "if any part of a statement is false, mark it false." Or instead of simply being asked to identify whether a statement is false or true, learners can be asked to correct any false statements. The third example in sample 9 shows this latter type of modified true-false item.


SAMPLE 9

(Reproduction of sample true-false items showing several response formats; the text is not legible in this copy.)

Completion items. A statement in which a key term or phrase is omitted and replaced by a blank space is a completion item. The learner must complete the statement by supplying the correct word or phrase.

These items can be used to assess learners' recall of a wide range of facts, information, and principles, rather than their ability to simply identify a correct answer in a group of provided alternatives. Because the answer is not supplied, the guessing factor is lower than with matching, multiple-choice, and conventional true-false items. Like these other types of items, however, completion items can be scored fairly quickly and easily.

You can begin to develop completion items by writing a set of brief statements that cover the most important information included in the course or instructional unit. The next step, as you review each statement, is to identify a key term, symbol, or numerical value; delete it from the sentence; and replace it with a blank. If an article or other word that would provide a clue to the answer (e.g., a, an, this, the, these) precedes the blank, you should omit that too. Note that only significant terms should be omitted from completion items.


Items should provide enough specific information so that students can know how to respond. This is perhaps best illustrated by the following example of how not to write a completion item.

The moons of the planet Jupiter were first discovered in ________.

Learners could provide a variety of correct answers to this statement: 1610 (the year in which the moons were first discovered); a specific area of the night sky (or, for that matter, the sky itself); orbit around Jupiter; Italy (the country in which Galileo, the discoverer, lived); and so on.

Depending on the knowledge that you are testing, this item could be better stated as follows:

The moons of the planet Jupiter were discovered in the year ________.

The moons of Jupiter were first discovered when the planet's position was at ________ degrees of arc.

The astronomer who discovered the moons of Jupiter was living in the country of ________ at the time of the discovery.


To ensure that the item is both reliable and valid, there should be only one possible correct answer for each term that is omitted. When scoring completion items, it is important to remember that the wording of correct responses need not be identical to those in your answer key, as long as the meaning is the same.

Although more than one key term may be omitted from a statement, the item should require only brief, factual answers (i.e., short lists of words, symbols, or numbers). If you want to determine whether learners can supply and discuss a body of information, essay items are more appropriate.

Regardless of the length of the answer, blank spaces for all items should be the same length so as not to provide clues to the correct term. To simplify both the answering and scoring of completion items, some instructors insert only small blanks in the body of the statement itself and provide larger blanks for the responses in a column to the left of the item.

Developing Subjective Test Items

The most common type of subjective test item is the essay item. Essay items require the highest level of communication skill and call for the most complex thinking patterns of all the test items discussed so far. They can be a valuable tool for assessing learners' ability to organize, analyze, synthesize, and present related information. They can also be used to test learners' ability to solve problems.

Depending on the specificity of the essay question or statement, you can cover a fairly wide or narrow range of subject matter in one test item. You can require responses that vary in length from a single, short paragraph composed of one or two sentences, to a few short pages of prose.

A potential drawback to the use of essay items is that learners who have a low level of verbal communication skills are at a disadvantage. Hence, essay items may not provide a valid or fair test of their knowledge or analytical abilities. If such written communication skills are not critical to success in the occupation, essay items are probably not an appropriate assessment technique.

Both the administration and scoring of essay items can be quite time-consuming. Students need more time to develop responses than they do for shorter test items. Scoring, too, takes longer because it requires careful reading, interpretation, and judgment on the part of the scorer.


Types of essay items. One of the most important factors in developing an essay item is determining precisely the knowledge that you will evaluate and the type of essay item that is most appropriate for assessing that knowledge. Common types of essay items include summary, critical analysis or critique, comparison and contrast, and case study. Each type has somewhat different applications.

You might request a summary if you want to assess learners' ability to recall, identify, and report the essential elements of a body of information. For example, in a studio photography course, a summary essay item might be stated, Summarize the advantages of using a moderate telephoto lens in portrait photography.

You might ask learners to provide a critical analysis or critique when it is important to determine whether they can identify, analyze, and weigh the strengths, weaknesses, limitations, and unique characteristics of a particular method or tool. An example of this type of essay item might be stated, Write a critical analysis of the uses of the 35mm lens in landscape photography.

You might use a comparison and contrast essay when you want learners to identify and discuss the similarities and differences in the uses of various principles, methods, tools, or machinery. An example of this kind of essay item might be, Compare and contrast 35mm and 2 1/4" format cameras.

If you need to test learners' ability to apply knowledge, you can develop a case study or recommendation essay item. This is an item in which learners are asked to analyze a situation or problem and, based on their analysis, recommend an effective solution. These items should be presented in the form of case studies that describe realistic circumstances (e.g., a supervisor's role in dealing with tardy employees, a malfunctioning automobile that exhibits certain symptoms).

Case study items can also take the form of video presentations. Learners view videotapes of various situations and then propose solutions. No matter how you choose to present the case study, the circumstances presented should be realistic and provide enough detail that learners have relevant and sufficient information to work with.

Constructing items. After you have identified the appropriate type of essay item, the next step is to develop it. The essay item should be stated fully and specifically, clearly outlining (1) the type of information or solution that learners must provide and (2) how learners should present that information.


You should identify all the factors that are being assessed as part of the item. Learners need to know specifically what elements will be scored and the weight or point value of each one. These elements can include content, organization, grammar, punctuation, spelling, clarity of the writing, and others. Remember also that it is not legitimate to score any factors that are not part of the learning experiences.

Many instructors recommend that communication skills be scored separately from the content. In this way, both the instructor and the learners can distinguish between mastery of the subject matter and quality of communication skills.

Scoring. You will need to develop a scoring key that identifies all of the information and other factors that are being tested. The next step is to determine the point values of the item as a whole and of each factor. With all these data clearly outlined, you will have a comprehensive scoring key that will help minimize the subjective element in scoring essay items.
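In practice, such a key can be as simple as a list of the factors being scored and the points assigned to each, with content kept separate from communication skills as recommended above. The Python fragment below sketches one hypothetical key; the factors and point values are invented.

# Hypothetical scoring key for one essay item (20 points total).
scoring_key = {
    "content": {
        "States the advantages of a moderate telephoto lens for portraits": 8,
        "Supports each advantage with an example": 6,
    },
    "communication": {
        "Organization and clarity": 4,
        "Grammar, punctuation, and spelling": 2,
    },
}

for category, factors in scoring_key.items():
    print(f"{category}: {sum(factors.values())} points")
print("total:", sum(sum(factors.values()) for factors in scoring_key.values()))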

In order to ensure that the item is as objective and valid as possible, you should compare the item with the answer key and ask yourself these questions:


Is the item stated in such a way that it clearly requests the responses that are listed in your key?
Do the instructions clearly state the way in which learners are to respond?
Given the length of the test, is there sufficient time for learners to fully respond to the item?

Bear in mind, however, that because each learner will organize and synthesize information differently, flexibility is a key element in scoring essay items. You will find that, with some practice, you can develop an answer key that is comprehensive and specific, yet flexible enough to allow for variations in individual responses. Many instructors find that it is easiest to be objective and evenhanded in their scoring if they read all learners' responses to a single essay item in one sitting.

If there are learners in your program who are inexperienced with essay items or whose basic communication skills are not well developed, you may want to plan practice sessions during which they can develop these skills. Practice sessions and tests should begin with short essay items that require less writing and less complex organization and build gradually to longer items.

Performance Tests

There are many situations in which pencil-and-paper tests are not the most appropriate evaluation technique. Performance tests can provide a practical means of assessing how well learners can (1) carry out a process or sequence of tasks and/or (2) create a product.

Performance tests can be a valuable tool in a number of phases of instruction. They can be used as pretests to determine learners' current skill levels before beginning the program or a new learning experience. (One caution: Pretests of a performance nature should never be used if potentially dangerous equipment, materials, or operations are involved.)

Performance tests can also be used as progress assessments, as final assessments, and as a part of the actual learning experience. Learners can use these tests as a self-check in practicing their skills and determining their own levels of achievement. This is a common feature of competency-based education (CBE) programs.

In developing performance tests, it is important to remember that they must conform to the same high standards as written tests: they must be valid, reliable, and usable.

Performance tests (see sample 10) generally include three major components:

A clear statement (directions) concerning the testing situation (What task is the learner expected to perform? What product is he/she expected to produce? How will the performance be evaluated?)
A checklist of criteria, or performance standards, to be met
A rating scale


Directions/Testing Situation

The first step is to identify the objective or objectives to be covered by the test. If properly stated, each objective should specify the action to be performed, the conditions under which performance must occur, and the criteria to be met. For instance, a health care program for hospital aides might include the following objective:

Objective: Given a hospital bed occupied by a patient who is dependent on mechanical life-support systems (condition), change the bed linens (action) without injury to the patient (criterion), while maintaining the life-support link at all times (criterion).


SAMPLE 10

(Reproduction of a performance test, including directions, a criteria checklist, and a rating scale; the content is not legible in this copy.)


Based on the objective(s), you next need to describe the situation in which the learner's performance will be evaluated. The situation should include specific details about the task(s) to be performed and the learner's role. For example:

Situation: As a hospital aide, you are changing bed linens. Your next patient is confined to bed and dependent on mechanical life-support equipment. You need to change the linens on that patient's bed. Based upon the criteria listed below, your instructor's evaluation will determine whether you are competent in this task.

In some cases, the situation you devise will be an actual one. In this particular case, where the learner's errors could be life-threatening to a real patient, you would need to devise a simulated situation. For example, another learner could role-play the patient.

Criteria/Checklist

What criteria, or standards, will you use to evaluate the learner's performance? Will you evaluate the learner's performance by observing only the process followed . . . only the product produced . . . both? What other elements (e.g., time, attitudes) need to be evaluated?

You may evaluate process only in situations where no tangible final product is produced (e.g., Greet visitors to the office). You may evaluate product only, which saves time, when all key process elements can be inferred from observing the final product. In other words, the only way the learner could have arrived at an acceptable final product was by following the correct process. Frequently, you will want to evaluate both process and product.

A finished product provides valid evidence of proficiency because it is readily measured, and the measurement can be done with relative objectivity. If you are evaluating a product, you will need to list all the factors or elements that should be present and observable in the product. For example, product criteria for the linen-changing objective might include the following:

The patient was uninjured by the linen-changing process
The life-support systems were still in place
The linens on the newly made bed had tight hospital corners

Process, too, must be evaluated in situations such as the following:

Critical inner parts of a final product are hidden from view (as in a surgical dressing, where inner aseptic measures and medications are of even greater importance than the outer visible areas)


Sanitation or safety procedures are of extreme importance and must be meticulously observed
Evaluation involves "live customer" work (as in cosmetology, where an error in the hair dyeing or permanent process could have long-lasting unhappy consequences)

For example, steps involved in the linen-changing process might include the following:

The learner adjusted the bed to the proper height
The learner removed the old linens carefully from the bed
The learner communicated reassuringly with the patient
The learner ensured that life-support systems were always in place

Often, attitudes are crucial to occupational skills. And they're not as impossible to measure as is usually supposed. Measurement just requires a little attention and thought on your part. You need to determine what behaviors would indicate the presence of the desired attitudes.

We all say things like, "He's a very courteous person" or "She's a real leader." How do we come to those conclusions? We get clues from a person's actions: he says please and thank you; she volunteers for extra work, and her peers ask her for help.

Some skills involve important time factors that must be measured. The batter must be beaten for five minutes. The typist must be able to type 45 words per minute to achieve entry-level competence. Time standards may come from specific industry documents, such as flat-rate manuals. Or they may derive from general trade agreements and union contracts.

From your own experience and expertise in the field, you will be able to identify many of the criteria for successful performance. Other sources for identifying criteria can include employers, labor unions, advisory committees, and manufacturers of the equipment or materials that are being used.

These criteria, logically sequenced and clearly stated in observable terms, form the basis for the performance checklist. The checklist should have at least the following important qualities:

The checklist should be short enough to make it practical for you to use. Perhaps 5-10 items are sufficient for a simple skill, 13-20 items (at the most) for complex skills.
The criteria included must be critical to successful performance of the skill. Minor or trivial criteria just make your evaluation job more difficult and time-consuming.


Each criterion should have some qualitative base. It is not enough to record that the student did something; he/she may have done it poorly or very well, and this needs to be shown.
The items must be simple and unambiguous, quickly read and understood by you and the students.
Items should be stated in parallel terms (e.g., all in the past tense).

If you find that your checklist is impractically long, review the criteria and determine whether each item is truly essential. If the checklist cannot be shortened, you should consider dividing the original performance objective into two separate objectives that can be assessed individually.

Rating Scales

There are a number of types of rating scales that you can use. The type you select and the format you use will depend on the nature of the performance to be evaluated.

Yes/no rating scale. One kind of rating scale is the simple yes/no scale (see sample 10). The learner either did or did not perform the task at an acceptable level, and you mark yes or no accordingly. No assessment is made of the degree to which the performance was acceptable. This type of system is most appropriate when increments of quality are not an issue.

Let's say, for example, that a performance test requires the learner to change the blade on a 1/8-inch table saw. One of the criteria states, The nut on the bolt must be completely tightened, and there is no acceptable, observable range of tolerance. In this case, a yes/no scale would be appropriate.

Multi-level rating scales. In many instances it is important to assess the relative quality of the performance or product, or the degree to which a characteristic is present. Furthermore, there may be broad tolerances for successful performance in some areas. In any of these cases, it is most appropriate to use a multi-level rating scale that includes a range of performance levels.

There are two basic types of multi-level rating scales. Probably the most commonly used is a numerical scale, in which numbers (e.g., 1 to 5) designate various levels of quality for each factor that is assessed. It is important that the level of quality represented by each number be clearly defined and explained to learners. The following is a list of standard equivalents:

5 = Excellent
4 = Very good
3 = Good
2 = Fair
1 = Poor

Some instructors prefer to use verbal response categories that contain evaluative words, rather than numbers. The Teacher Performance Assessment Form (TPAF) at the end of this module is an example of this type of rating scale. Whether you use numerical or verbal rating scales for evaluation, five levels of performance seem to provide the optimum range of responses.

Format. Rating scales can be arranged in a variety of formats. Among the most popular approaches is to place columns of blanks or squares to the right of the performance items (criteria). At the top of the scale, the rating categories are listed from left to right (e.g., 1, 2, 3, 4, 5, or their verbal equivalents). Directly below each category are the columns of blanks or squares. When each item is evaluated, a check mark is placed in the appropriate blank or square. The TPAF in this module and the checklist in sample 10 use this type of format.

A graphic scale is composed of a horizontal line that has response categories marked along the line. Each item on this type of checklist has its own individual scale, and ratings for each item are placed in the appropriate place. Graphic scales are particularly useful when brief, task- or product-specific descriptions of performance levels for each item are necessary. Sample 11 shows a graphic rating scale for one item.

Levels of ratings. In rating a learner's performance, there are at least two levels involved. You need to rate performance in terms of each individual criterion, or item, on the checklist. You also need to rate the individual's overall performance.


SAMPLE 11

If you look at sample 10, p. 37, for example, you will note that a simple yes/no rating scale has been provided for rating each criterion. However, at the top of the performance test, there is a multi-level scale (1-5, with clear descriptors provided for each level) for rating the learner's overall level of performance.
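If you keep scoring records electronically, the two levels of rating described above can be sketched in a few lines of code. This sketch is illustrative only and is not part of the module; the function name, the criterion labels, and the use of Python are assumptions made for the example.

# Two levels of rating: a yes/no mark for each criterion on the checklist,
# plus a separate 1-5 rating of the learner's overall performance.
OVERALL_LEVELS = {1: "Poor", 2: "Fair", 3: "Good", 4: "Very good", 5: "Excellent"}

def summarize_performance(item_ratings, overall_level):
    # item_ratings maps each criterion to True (yes) or False (no);
    # overall_level is the instructor's 1-5 rating of the performance as a whole.
    attained = sum(1 for met in item_ratings.values() if met)
    return {
        "criteria attained": attained,
        "criteria not attained": len(item_ratings) - attained,
        "overall rating": str(overall_level) + " (" + OVERALL_LEVELS[overall_level] + ")",
    }

# Example: two of three hypothetical criteria met; overall performance rated "Good."
print(summarize_performance(
    {"selected appropriate stationery": True,
     "used appropriate margins and spacing": True,
     "completed the task in 1/2 hour or less": False},
    overall_level=3))

The summary mirrors the format used in the sample performance tests: a count of criteria attained and not attained, plus an overall level of performance.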

Administering the Test

Before you administer a performance test, you should provide students with copies of the checklist to use as a reference while practicing their skills. In CBE programs, performance checklists are often a part of student learning guides, modules, or other instructional packages. If it is at all possible in your situation, you should ask learners to determine when they are ready to have their performance formally evaluated and set a convenient date with you.

When preparing for the test, it is your responsibility to be sure that all the necessary materials, tools, and equipment are available. The actual area in which the test will take place should be clean and free of distractions.


It is best to administer process-oriented tests individually because you will need an unobstructed view of each step in order to evaluate the learner's performance fairly. In some instances, it may be possible to administer the test to a small group of learners at one time, provided that (1) you have a clear view of each person's activities and (2) everyone in the group is prepared for the test at that time.

If product evaluation is the focus of the test, students may simply submit their products to you for evaluation.

After the performance test is completed, you must determine whether each student's performance as a whole meets at least the minimum acceptable standards. If not, the learner will need to practice in order to bring performance up to acceptable standards. Regardless of the total performance rating, you should schedule follow-up conferences with learners to discuss how and why you have arrived at your assessment.


Optional Activity

For more information about how to construct tests and test items, you may wish to read one or both of the following supplementary references:

Denova, Test Construction for Training Evaluation, is a readable manual that provides much useful information on evaluation, including clear directions on how to develop an evaluation plan, tests, and individual test items.
Mager and Beach, Developing Vocational Instruction, discusses, step by step, the processes involved in developing instruction. Chapters 5, 6, 7, and 8, pp. 28-51, explain how to establish course objectives and prerequisites, and identify types of performance and appropriate measuring instruments.

To gain additional skill in developing tests, you may wish to refer to one or more of the following PBTE modules:

Module D-2: Assess Student Performance: Knowledge
Module D-3: Assess Student Performance: Attitudes
Module D-4: Assess Student Performance: Skills
Module K-4: Provide Instructional Materials for CBE
Module K-5: Manage the Daily Routines of Your CBE Program


Read the following test developed by a vocational instructor. As you read, keep in mind the principles and steps involved in developing an effective test. After completing your reading, critique in writing all essential factors of the test.

UNIT TEST

Unit Test: Business Letters

Directions: The following is a test that covers the complete learning unit on business letters. It includes both a written and a performance test. The performance test will be administered at the end of the time allowed for the written test.

True-False

1. In a full-blocked style of business letter, the date line, complimentary closing, and writer's identification all begin at the center. All other lines begin at the left margin.

2. In the semiblocked style of business letter, the date line, complimentary closing, and writer's identification all begin at the center. The first line of each paragraph is indented five spaces.

3. The simplified style of business letter includes the following features: All lines begin at the left margin, a subject line replaces the salutation, punctuation is open, the complimentary closing is deleted, and the writer's name and title are typed on one line in all capital letters.

4. In the blocked style of business letters, all lines begin at the left margin.


Matching

Directions: The list on the left provides definitions of various parts of a business letter. The list on the right contains the names of the various parts. Match each name with its correct definition.

5. the writer's name or position or both

6. the writer's and/or typist's initials

7. indicates the way in which the letter was sent

8. a closing phrase, such as "Yours truly"

9. the name and address of the person who is receiving the letter

10. indicates that the letter is written to a specific person or department

11. an opening phrase, such as "Dear Ms. Morris"

12. the names of people other than the addressee who will receive the letter

13. a business's printed name and address

14. the month, day, and year that the letter is typed

15. the typed name and address of the business, if letterhead stationery is not used

16. the text of the letter

17. used in place of salutation; typed on a separate line in all capital letters

18. indicates that the writer speaks for the company

a. letterhead

b. date line

c. inside address

d. salutation

e. message

f. complimentary closing

g. company signature

h. writer's identification

i. reference initials

j. mailing notation

k. carbon copy notation

l. return address

m. subject line

n. attention line

Completion

Directions: Fill in the blanks.

19. The heading of a business letter includes _______ and _______.

20. The opening of a business letter includes _______ and _______.

21. The body of the business letter includes _______.

22. The closing of the business letter includes _______, _______, and _______.

Multiple-choice

Directions: Each of the items below is followed by four possible responses. Select the one correct response for each item.

23. If a building name is included in the inside address, it should:

a) be typed in the same line as the street address
b) be typed in the line directly below the street address
c) be typed in the line directly above the street address
d) be typed in the same line as the addressee's name


24. House and building numbers in street addresses should be:

a) typed in numerical figures
b) typed out in word form
c) typed in the same line as the street name
d) typed preceding the street name

25. Which number used as a street name should be spelled out?

a) 24
b) 100
c) 17
d) 4

26. The U.S. Postal Service abbreviation for the state of Maine is:

a) MN
b) ME
c) MA
d) MI

27. The U.S. Postal Service abbreviation for the state of Maryland is:

a) MA
b) ME
c) MN
d) MD

Essay

28. Your text describes eight ways to lengthen a short letter. List and describe all of them.

Performance Test

Learner's Name: Date:

Directions: Given access to the needed office supplies, edit and type a handwritten draft of a business letter, using printed letterhead stationery, 12-pitch type, and the blocked style. Your performance will be assessed using the following criteria. The instructor will check the YES or NO box to indicate whether you performed each step correctly or whether the product met each criterion as indicated.

The learner:  YES  NO

1. selected appropriate stationery

2. identified all errors (e.g., spelling, punctuation, capitalization) in the handwritten draft

3. corrected all errors accurately

4. made corrections without changing the meaning of the letter

5. used appropriate margins and spacing given the type of stationery, the size of the type, and the length of the letter

6. completed the task in 1/2 hour or less


The completed letter:

7. included all standard parts in the heading, opening, body, and closing

8. was laid out in accordance with standard block style

9. was error-free

10. was neat and clean (e.g., free of smudges)

Number of criteria attained

Number of criteria not attained

Total score

Instructor's signature

Level of Performance: All items must receive a YES response. If any item receives a NO, the learner and instructor should meet to determine what additional activities the learner needs to complete in order to reach competency in the weak area(s).

Page 49: Research, in Vocational Education. ISBN-0-89606-237-6 · ed 289 969. author title. institution. spons agency report no pub date note. available from. pub type. edrs price. descriptors.

Compare your written critique of the test with the model critique given below. Your response need not exactly duplicate the model response; however, you should have covered the same major points.

MODEL CRITIQUE

Although this test has its strong points, many sections need to be thought through and revised before it is administered to learners.

Some of the sections, as well as individual items, are clearly written and easy to follow. But others are so poorly and ambiguously stated that learners have to guess at the intent of questions or statements in order to respond. This does not create a valid, fair, or reliable test.

Test directions are inconsistent. Again, some are clear and direct while others are incomplete, unclear, or simply absent. The general directions at the beginning of the test, as far as they go, are fairly well written. However, they are incomplete. These directions should include information about (1) the total point value of each section of the test and (2) the amounts of time allowed for the written and performance portions of the test. Directions for each type of item on the test are, in general, poorly written. Specific instances will be discussed in relation to the various types of test items.

The test is fairly clearly laid out, readable, and easy to follow. There are, however, some distinct mechanical problems in the layout (e.g., in the true-false section, no space is provided for learners' responses).

In addition, there are too many different types of items in this test. Because of this, learners have to switch mental gears frequently in order to respond to different types of items. This can make it difficult to concentrate on the content of the test. Perhaps some types of items could be rewritten in another format. For instance, the completion items might be written as short essay items and grouped with the other essay item (e.g., Item 22 could be stated: Name the three basic elements that form the closing of a business letter).

Although there are too many different types of items on the test, the number of items seems to be too few, given the intended scope of the test. And many of those items that are on the test cover a narrow range of information. Test directions clearly state that the test will cover a complete unit on business letters. Thus, the learner might assume that a wide range of knowledge and skills is to be tested. However, the test does not do this, and the result is a spotty and sometimes trivial assessment of the learners' knowledge.

The multiple-choice section, for example, could be expanded to include more items with a greater variety of content. As it is, it deals only with a few aspects of one feature of the business letter: the address. Two of these six items, those that deal with U.S. Postal Service abbreviations, simply test learners' rote memorization of information, which is probably less critical than other knowledge that might be tested.

In addition to the general strengths and weaknesses of the test, the individual sections of the test have both weak and strong points. These are addressed individually in the following discussions.

True-False

The two main problems with this section are that there are (1) no directions for how the items are to be answered and (2) no spaces in which to write responses. Complete directions, including the point value of individual items, should always be provided for each type of item on the test.

The individual items seem to be stated clearly and specifically enough that they are either true or false. However, the instructor may have provided clues to the correct answers by writing longer, more detailed statements for true items than for those that are false. Maintaining a fairly uniform length of true-false items is an important factor in developing items that do not help learners guess the correct answer.

This portion of the test is short, which is not a problem in itself. However, this section would have been a good place to include a greater number and variety of items to better cover the scope of the unit.


Matching

Generally speaking, the matching section is one of the better developed parts of the test. It comprehensively covers an area of knowledge. Both the premises and responses are clearly written, and there is no predictable response pattern that provides clues to the correct answers.

On the other hand, the possibility of guessing is increased by having exactly the same number of responses as there are premises. There should always be more responses than premises. And Item 15, as well as the directions for the performance test, provides a clue to the correct response to Item 13 (letterhead).

Another problem with this section is that the directions do not explain how to mark responses and no spaces are provided for responses. The directions are otherwise clearly stated, but this is an important omission.

Completion

This is one of the most poorly developed sections of the test. The few directions given do not include the point value of the items and do not explain the statement, Fill in the blanks. With what?

All the items are written so ambiguously that learners could supply a number of different correct answers for each item. For example, Item 21 could be correctly answered with such responses as words, information, text, message, and many others. Items like this turn test-taking into a guessing game and produce tests that are neither reliable nor valid.

Another problem with the completion items is the different-sized blanks. The lengths could provide clues to correct answers (in this particular case, however, the statements are so ambiguous that learners need all the help they can get!).

Multiple-choice

The instructor did remember to write some directions for this section, and what is there is understandable. But once again, the directions are incomplete. They do not indicate how learners are to mark their responses, nor is the point value of the items stated.

Overall, the items are clear and well written. No clues to the correct responses are provided in the stems, and there is a good number of responses to choose from.

One exception, which lowers the quality of this section and reduces the validity of the test, is Item 24: two of the responses provided are correct. Since the directions state that learners are to select the one correct response, which are they to choose?

Earlier criticisms of this section, the triviality and limited scope of the questions, are also points for improvement.

Essay

The essay item competes with the completion section for being the worst element of the test. It is stated in such a way that it is largely a test of learners' rote memorization of information provided in the text. It might indeed be important to determine whether learners know how to lengthen a short letter. However, the wording of the question implies that learners must reproduce memorized, printed information, rather than answer from their own bank of knowledge.

In addition, directions are incomplete: the approximate length of the required response is not indicated, nor is the point value of the item cited. Also, very little space is provided for the response. Even if the instructor wanted a brief answer, there is still not enough room, given the range of variation in how learners word their essays and how large they write.

Performance Test

The performance test is the best portion of the unit test, although it would have been more usable had it been put all on one page. (This would also have allowed more space for the response to the essay item.) It is clearly laid out, complete, and easy to follow. The directions here are the best on the test. They are fully and clearly stated, as is the objective.

The items on the checklist are well written and include all important steps and criteria. It should also be noted that only important steps and criteria are itemized. This contributes to the manageable length of the test.

The yes/no rating scale for the items seems appropriate for the objective, and the checklist is easy to use, both for marking and for interpreting responses.

A space for additional comments is the one element that is missing from the test. Otherwise, the performance test seems complete.

Level of Performance: Your written critique of the test should have covered the same major points as the model critique. If you missed some points or have questions about any additional points you made, review the material in the information sheet, How to Construct a Test, pp. 28-40, or check with your resource person if necessary.


Learning Experience III

OVERVIEW

After completing the required reading, you will be critiquing the performance of an instructor, described in a given case study, in developing evaluation plans for use with adult learners.

You will be evaluating your competency in critiquing the instructor's performance in developing evaluation plans for use with adult learners by comparing your completed critique with the model critique.


Activity


Adults need to be involved in determining both what they learn and how their learning will be evaluated. The skills they gain in identifying appropriate evaluation techniques will enable them to become better lifelong learners, both in and outside of the classroom. For information on how to develop evaluation strategies that are particularly appropriate for use with adult learners, read the following information sheet.

EVALUATING ADULTS: WHAT'S APPROPRIATE?

When adult learners come to the classroom, many of them bring a history of independence and responsibility in coping with life situations. They also have a future that demands that they continue to learn and be able to make effective judgments in many new circumstances. Because of this, it is vital that you devise and implement evaluation strategies that can help students learn how to realistically identify and evaluate their own resources, abilities, and knowledge.

In many classroom situations, learners do not help determine how and what they will learn, or how and on what basis they will be evaluated. Unfortunately, this situation does not provide learners with essential skills in evaluation. Evaluation strategies for adults are most effective when traditional authority roles are de-emphasized, and the learner's role as an autonomous, responsible adult is emphasized. This can happen when learners are actively involved in identifying and establishing their own evaluation techniques, techniques that will enable them to meet their objectives.

Admittedly, there will be learning situations in which neither you nor the learners will have many options in establishing learning objectives or selecting the types of evaluation devices, criteria, or grading system to be used. Even so, there is still much that you can do to help learners gain more skill and autonomy in dealing with what may first appear to be arbitrary demands. One of the best strategies is to use collaborative approaches.

Using Collaborative Approaches

Collaborative approaches involve learners in identifying and developing learning objectives, evaluation techniques, and criteria. These approaches acknowledge that adults have not only a right but a responsibility to take a more active role in determining all phases of their learning experiences, including evaluation.

Collaboration has several advantages. It encourages a high degree of learner involvement and commitment, both from individuals and from the group. It can enable learners to gain valuable understanding of how they learn.


Collaborative approaches provide a valuable source of preassessment information, since learners must identify their current skills and knowledge in order to determine their learning objectives and subsequent evaluation techniques. An additional benefit of collaboration is that evaluation is often less threatening to learners when they are involved in designing the evaluation strategies than if the strategies are generated solely by an outside authority.

Depending on your situation, the results of a collaborative approach can range from an evaluation system that is totally learner-designed and -scored, to one that includes many standardized or instructor-designed and -scored tests, papers, and projects. Collaboration may not be appropriate in all types of learning situations, however. For example, you may find that it is too time-consuming and cumbersome a process if time in your program is short and learners are unfamiliar or uncomfortable with developing their own learning and evaluation strategies. The extent to which you and learners can collaborate will depend on a number of factors, as follows.

Institutional requirements. You may be teaching in a situation in which the instructional elements, objectives and evaluation tools, are determined by a variety of external sources. In an industry-based training program, for instance, the firm supporting the program may have established both the objectives to be met and the evaluation instruments to be used. Or licensing requirements for a given occupation may govern what objectives must be met, and a standardized licensing exam may provide the final basis for assessment. Learners seeking to enter a new occupation or to reach a higher occupational level will need to meet objectives covering the competencies required to meet those career goals.

There may be nothing that you can do to change these requirements. However, you and the learners can still collaborate in identifying additional learning objectives and specific evaluation techniques that will help them to meet their unique personal objectives.

Individual needs. In some situations, both the learners and the instructor have numerous options in determining the content and methods of evaluation. In these cases, evaluation can be based largely on learners' individual needs, skills, and objectives.

Imagine, for example, a typing course in which learners are enrolled for different reasons. One is a pharmacist who has enrolled to become a more accurate touch-typist. Another is employed as a typist but wants greater skill in typing statistical tables.

After a brief discussion, the instructor discovers that accuracy, rather than speed, is the pharmacist's main concern. So they develop a learning objective based on that need: Touch-type 30 words per minute at 100% accuracy. In the case of the professional typist, the instructor and learner develop the following objective: Type three different types of statistical tables at 45 words per minute, with 90% accuracy.

Learners' experience. Learners' familiarity with different types of learning situations will also affect the amount of involvement they can have, at least initially, in making instructional decisions. Learners who are familiar only with conventional, instructor-designed courses may be uncomfortable with a collaborative approach. At first, it may seem like an overwhelming amount of new work. Thus, it is important to introduce collaborative activities in small, manageable portions. You will need to take the time to teach them how to develop appropriate learning objectives and select appropriate evaluation strategies.

Learners who are experienced with collaborative approaches will probably be more comfortable with the process and able to move more quickly in identifying learning objectives and relevant evaluation strategies.

Students' experience with the subject area will also affect how much they can contribute to the process. Learners who have little knowledge of the subject area will require much more direction in identifying and establishing options in learning and evaluation. This will be less of a factor in an advanced course, in which learners are quite familiar with the principles, techniques, and tools of the occupation.

Establishing the Basis for Evaluation

With all these variables in mind, how can you work with learners to establish a basis for evaluation? Instructors have made use of a variety of techniques, tailored to their situations. Three collaborative approaches that are especially appropriate for use with adults are group decision making, learning contracts, and grading contracts.

Group decision making. As the name implies, this process involves class members, as a group, in identifying and selecting the learning objectives to be met and evaluation strategies to be used. Using this technique, you can also allow learners to incorporate individualized learning and evaluation strategies into the group plan.

Although learners should have a fair amount of freedom in this process, you will need to provide some information and structure so that they will have a frame of reference for decision making. One strategy is to provide forms on which learners can identify the following:

Learning objectives
Evidence of accomplishment (proof that the objective has been met)
Evaluation strategies (how the evidence will be gathered: a written test, oral test, or performance test; instructor review, peer review, or self-evaluation)
Criteria for evaluation
Timing of evaluation (when the evidence will be gathered)

You will also need to discuss with them any institutional, occupational, and degree/licensing requirements that will affect the evaluation system to be developed.

Group decision making may not be appropriate in all learning situations. If you are teaching in a course in which learners' needs and goals vary greatly, they may find it difficult or impossible to reach consensus about evaluation strategies. Or, if their group communication skills are not well developed, the group "consensus" may not truly reflect the needs and objectives of all group members. Some may dominate discussion, while others say nothing. You will therefore need to determine whether group decision making is really an appropriate means of determining evaluation strategies in your situation.


The following is a four-step approach that is commonly used in group decision making:

1. Brainstorm. Learners form committees (of up to about seven persons), which meet to generate possible (a) learning objectives, (b) evidence of accomplishment, (c) evaluation strategies, (d) criteria, and (e) timing of evaluation. Although this process is designed to enable learners to make their own decisions, they may need your help. During the committee session you could visit each committee to ensure that the meeting is running smoothly and productively. Learners may have questions that need to be answered before they can reach any decisions. By the end of a set amount of time, learners must reach a consensus about all these factors. The committee then selects a representative to report the group's recommendations to the rest of the class.

2. List. At the close of the committee process, the class reconvenes. Each committee representative presents recommendations to the class. In your role as a facilitator, you can help generate a positive, productive session by recognizing the value of everyone's ideas and accepting all recommendations as being open to discussion. You should list all recommendations on a flip chart or chalkboard.

3. Discuss. After all recommendations have been submitted, class members discuss each recommendation in terms of its ability to meet individual and group learning needs. You may need to direct the discussion in order to keep participants focused on the topic and ensure that all members have the opportunity to participate. You may also need to respond to recommendations on the basis of your occupational experience and to suggest items that may have been overlooked.

4. Decide. When the pros and cons of all recommendations have been discussed, you and the class must reach consensus about instructional objectives and specific evaluation strategies for the course.

Learning contracts. The learning contract can help learners to clarify their objectives, document their learning and evaluation plans, and commit themselves to the work they have contracted to do. Contracts also provide a fair and clear basis for evaluation, since the learner knows exactly what will be evaluated and when and how it will happen.

The amount of learner input into the contract can vary greatly, depending on how structured the program is and how independent the learners are. In some long-term programs, for example, learners who are experienced with the subject matter may be required to develop contracts that include some learner-designed evaluation instruments.

In most other situations, however, modified learning contracts are used. For example, an instructor might develop a variety of learning objectives, learning experiences, and evaluation activities. Evaluation activities might include teacher-graded papers and projects, tests, self-evaluations, and peer evaluations. Learners would then select from those options and contract to participate in specific learning and evaluation activities that would enable them to meet their objectives.

There are three preliminary steps that you need to take before developing learning contracts:

1. Assess group and individual training needs.
2. Present general course objectives and institutional requirements related to evaluation. Identify both negotiable and nonnegotiable objectives and evaluation activities.
3. Discuss with learners how evaluation relates to learning objectives.

Once you are satisfied that learners understand these basic principles, you and they can work together to develop individual contracts for a given time period. The following are steps in developing a learning contract:

1. Identify learning objectives (which ones and how many).
2. Identify learning experiences that will enable the learner to meet these objectives.
3. Determine what will be accepted as evidence of accomplishment (e.g., a finished product, written test results, learner's performance, a paper, a presentation to the class).
4. Establish criteria for evaluating each item of evidence.
5. Identify appropriate evaluation strategies: How will the learner's achievement be evaluated? Who will evaluate the learner's work: the instructor, the learner, peers, experts in the field?
6. Set a target date for accomplishment of objectives and a date for evaluation to take place.

Together you and the learner will have to review all the elements of the contract before a final agreement is made and signed. Your experience in the field will enable you to judge whether the contract is complete and realistic. You may need to suggest items that the learner has overlooked and identify possible problem areas in the sequence or scheduling of activities. When agreement is reached about all the elements of the contract, you both sign and date the contract, and each of you keeps a copy for reference. Samples 12 and 13 are examples of learning contracts.
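For instructors who track contracts electronically, the elements developed in the six steps above could be recorded in a simple structure like the one sketched here. This sketch is not part of the module; the field names and all sample values are hypothetical, and Python is used only for illustration.

# Each signed learning contract records the six elements developed above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LearningContract:
    objective: str                    # step 1: the learning objective
    learning_experiences: List[str]   # step 2: how the learner will meet it
    evidence: str                     # step 3: what counts as proof of accomplishment
    criteria: List[str]               # step 4: how each item of evidence is judged
    evaluators: List[str]             # step 5: who will evaluate the work
    target_date: str                  # step 6: when evaluation will take place
    signed_by: List[str] = field(default_factory=list)

contract = LearningContract(
    objective="Type statistical tables at 45 words per minute with 90% accuracy",
    learning_experiences=["timed practice drills", "review of table formats"],
    evidence="three timed tables typed in class",
    criteria=["45 words per minute or better", "no more than 10% errors"],
    evaluators=["instructor", "self-assessment with a performance checklist"],
    target_date="end of week 6",
    signed_by=["learner", "instructor"],
)
print(contract.objective)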


Throughout the program, you should meet with learners individually on a regular basis to review the contracts in light of the progress they are making. You will need to determine jointly if adjustments need to be made to the original contract and, if so, where and how they need to be made. At the end of each contract, you will need to meet with individual learners to determine whether the contract has been fulfilled.

Grading contracts. Grading contracts provide learners with options (1) in the relative weight of evaluation activities (as in sample 3, p. 19) or (2) in the amount of work they will perform. With this information, and with knowledge of their needs, resources, and abilities, learners can help determine how they will be graded.

In the first situation, a learner who typically does very well on quizzes, but whose note-taking skills leave much to be desired, could contract to have quiz grades constitute a higher percentage of the final grade. The learner's grades on individual activities must still be earned, but the relative weight of those grades is determined, at least in part, by the student.

In the second situation, the final grade is determined by contract. The contract specifies how much work must be completed for an A, how much for a B, and so on. Because of the outside responsibilities typical of adult learners, some of your students may have limited time available for study. Those students may need to take your class to meet the requirements established for a degree or by their employer. However, they may wish (or have time) to do only the minimum work required to obtain a passing grade; those students could contract to complete the work required for a C. Other students, who have the time, the resources, and the need to complete more work or more in-depth work, could contract to complete the work required for a B or an A.

Regardless of the grade contracted for, the standards to be met must be nonnegotiable. A student should not earn the C he or she contracted for simply by completing the work; the work must be completed so that it meets the precisely measurable standards established for academic achievement or occupational competence.

To develop grading contracts, you will need to work with learners to specify the learning objectives and level of performance they must achieve. Generally, learners will need two to three weeks' experience in the program before they have a basis for developing such contracts. After signing the contract, both you and the learner should retain a copy for reference.
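As a small illustration of the first kind of grading contract described above, a weighted final grade could be computed as shown in the following sketch. The activity names, scores, and weights are hypothetical and are not taken from the module; Python is used only for illustration.

# Computing a final grade when the learner has contracted for quiz grades
# to carry a heavier weight. The weights must sum to 1.0 (that is, 100%).
def contracted_grade(scores, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[activity] * weights[activity] for activity in scores)

# A learner who does well on quizzes contracts for quizzes to count 50%.
scores = {"quizzes": 92, "notebook": 70, "final project": 85}
weights = {"quizzes": 0.50, "notebook": 0.20, "final project": 0.30}
print(round(contracted_grade(scores, weights), 1))   # prints 85.5

The grades themselves must still be earned; only the relative weight of each activity is set by the contract.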


Selecting Appropriate Evaluation Techniques

Regardless of how or by whom the evaluation strategies are established, it is essential that the evaluation procedures selected are appropriate to each learning objective.

Consider, for example, an automobile repair course in which learners prepare for entry-level employment. Occupational standards would provide evaluation criteria and suggest the appropriate form of evaluation. For the learning objective Gap spark plugs to industry specifications, the instructor would probably use a performance test.

If it were important to verify that learners had knowledge of the operating principles of the internal combustion engine, a cognitive written test could be the most appropriate evaluation instrument.

In many situations, self-assessment is an effective evaluation tool, particularly for adult learners. Learners can use answer keys to assess their own performance on written tests; they can use performance checklists to rate their competence on performance tests. Self-assessment can be used in preassessment, to help learners identify their skill and knowledge levels prior to beginning new learning experiences. It can also be used within learning experiences, to give learners an opportunity to practice and sharpen their skills before taking graded tests.

Through self-assessment, learners can complete and score tests in private, without the fear and anxiety often associated with tests given by an instructor. Then, when they feel they are ready, they can approach the instructor for a graded test. Initially you will often need to help learners determine their readiness for the graded test.

There are some commercially available self-assessment tests that can assess a range of skills in a variety of occupations. You may wish to check with counselors at your institution or state curriculum centers in order to determine what is available in your field. You can also develop your own self-assessment devices, using either performance tests or cognitive pencil-and-paper tests. These should be accompanied by a scoring key or model answers, as used in this module, that learners can use to score their own work.

Another excellent means of self-assessment is the journal, or log, in which learners record their learning plans and activities. Journals are especially useful in helping students document their learning activities and determine what they need to do in order to meet their objectives. They can be used in conjunction with learning contracts or as an independent learning and evaluation strategy. Depending on your situation, you may ask learners to submit their journals to you for review and discussion, or you may choose to have them keep private journals for their own personal reference.

Journal-keeping styles usually depend on individual preference. Some learners prefer to keep a simple log in list form and to jot down their learning activities, the objectives they have achieved, their own performance ratings, and their plans for meeting new objectives. Others prefer to keep a prose journal in which they discuss these things in more depth. There are a number of possible variations on these basic approaches that learners can adapt to their own personal styles.

If you are teaching in a program in which grades, licensing, academic credit, and institutional requirements are not an issue, you can determine evaluation devices and criteria largely on the basis of individual learners' needs and abilities. But those needs may vary quite a bit. Some adult learners may still want and need to use a variety of tests that will give them specific feedback about their progress. Others will prefer an approach which provides them with feedback but which is much less formal and structured.

In such a situation, you could give the tests that the first group desires and also provide for informal, less structured strategies for the others. Informal one-to-one conferences, group discussions, and private journal writing would probably meet the needs of the second group. Note, however, that in this type of learning situation, learner participation in test taking and other evaluation activities should be voluntary.

Selecting the Appropriate Evaluator

Who should evaluate the performance of adult learners? The learner? The instructor? Employers? Workers? Learners' peers? The answer to this question will vary with the situation. In many instances, all these people may be involved as evaluators at different phases of the learning experience. But many educators contend that involving the learner in self-assessment activities provides the most effective form of adult evaluation. Thus, it is important that you build as many self-assessment activities into the evaluation plan as possible.

Self-assessment enables students to learn how to form objective judgments and promotes a greater sense of self-direction and responsibility. Therefore, it helps them gain valuable skills that they can use throughout their lives, in any situation. Granted, the reliability and usefulness of self-assessment will depend, in part, on learners' self-confidence and skill in forming objective judgments. However, with practice, learners can improve these skills and become better lifelong learners.

Learners can also become skilled in reviewing and evaluating their peers' performance, products, or knowledge. This makes peer evaluation an excellent technique for use with adults. A learner might request another's help in evaluation, and later switch roles and become the evaluator. The student who acts as the peer evaluator can use a performance checklist or an answer key or simply provide informal feedback when rating the other's performance. After the initial peer evaluation, the students should meet to discuss the results and possible strategies for improvement.

Peer evaluation can allow learners time to practice skills and obtain objective feedback from people who are not involved in determining grades. Both learners benefit from this process. One gains valuable feedback about his or her achievement; the other sharpens his or her skills in identifying essential criteria and thus reinforces his or her own skills and knowledge.

You will need to stress to learners that they must be as objective, nonthreatening, and honest as possible in evaluation. Only in this way can their peers obtain useful, reliable information about their abilities. Note, too, that peer evaluation should usually be voluntary. Some students may be extremely uncomfortable in either role, thus limiting the validity or reliability of the evaluation. However, with practice, many learners can acquire skill and ease with peer evaluations.


There may be times when neither you nor the learners are the most accessible or most expert evaluator available. If a learner is completing an independent study in an area beyond your expertise, for example, an employer or worker with the needed expertise may need to be asked to evaluate the results. Likewise, in a cooperative education program in which learners are on the job part of the time, their on-the-job supervisors may need to take on some of the responsibility for performance evaluations. The same might be true in an industry-based training program.

In most situations, instructor evaluation should still be a substantial part of the evaluation system.


In fact, the instructor's role as an evaluator is probably the most varied of all. With your knowledge of the subject area, you can help learners identify essential learning objectives, appropriate criteria, and evaluation strategies. You can act as facilitator, helping learners to devise learning contracts and develop self-assessment and group decision-making skills. And of course, in conjunction with many collaborative evaluation approaches, you will need to design, administer, and score written tests, design and evaluate performance tests, and submit evaluation reports to the institution in which you are teaching.

Optional Activity

You may wish to read one or both of the following supplementary references:

Knowles, The Modern Practice of Adult Education, examines the principles and practices involved in effective adult education. Section III, "Helping Adults Learn," pp. 222-247, discusses strategies to help learners diagnose their learning needs, formulate learning plans, and identify appropriate evaluation techniques.
Knox, Adult Development and Learning, is an in-depth and readable study of the many social, personal, and physical factors that affect adult development. Chapter 7, pp. 405-469, provides a detailed overview of the factors that influence and direct adult learning.


The following case study describes how Harv Dunfry, a new technical instructor, developed evaluation plans for use with adult learners. Read the case study and critique in writing Mr. Dunfry's evaluation practices, both strengths and weaknesses.

CASE STUDY

Harv Dunfry hung up the phone and looked around the garage. "Not bad," he thought to himself. He had a three-bay garage now and steady customers who sent in referrals all the time. He'd come a pretty long way from the high school kid who'd picked up extra money doing car repairs for neighbors.

And now, one of his customers, who worked for the local school system, had just called. She'd asked him if he would be interested in teaching a beginning auto repair course in the night program for adults. He'd said, "Sure."

Harv Dunfry, a teacher, eh? Well, school had been all right, except for the teachers who made you crazy with their rules and pop quizzes. But then, he didn't have to be that kind of teacher. He'd also had some good ones who really knew their stuff and who made sure that you learned it. But at the same time, they had cared about who you were and what was important to you. That seemed to be more of what being a teacher was all about.

In a few days there was supposed to be an orientation program for new instructors. Experienced instructors were going to be teaching some techniques to use in the classroom so that you'd have something to go on when you first started. Harv hoped they'd have something useful to say.

As it turned out, the last night of the orientation program had been particularly interesting. Someone had talked about working with adult students. She said they needed independence and the freedom to have a say in what they were going to learn and how they were going to be evaluated.

It really had made a lot of sense. Wasn't that what his teachers in school had been doing when they let him work on some of his own projects and then helped him troubleshoot problems when he got stuck? Of course, he'd had to do regular studying and take tests, too, but that was usually fair. He might not have been able to work on some of the projects if he hadn't learned the material that the teachers required.


All in all, it seemed like the best approach for his course might consist of some readings, followed by a few tests on the things that he knew for sure everyone would need to know. Then, by watching people in the shop, he could tell if they were learning the hands-on skills that they needed. Other than that, everyone could decide about the additional things they wanted to learn and how they would be evaluated. That way, everyone could get the most out of the class.

The first night of class Harv handed out an outline that described the learning objectives included in the course, the name of the book they would need to get, and a list of the six tests that would be given. Harv explained that the reason he was giving these tests was that they covered material that everyone would need to know if they were going to learn anything about auto mechanics. He also explained that what they did in the rest of the course was up to them. They would decide what they were going to learn and how they'd be tested, and then he'd help them in whatever way he could.

He said that he'd heard about a process called group decision making that he'd like them to try out that evening. He explained how the process worked, and described all the steps involved. He then asked learners to form committees and decide on what else they wanted to learn and how they thought they should be evaluated. For example, did they want to take tests? Did they just want to talk to him about their work? Or were there other things they wanted to do?

After the learners had formed committees, Harv left the room while everyone met. He didn't want to put any pressure on them about what they should do, and he thought it would be better if he just left everyone alone for a while.

When they were done talking, everyone regrouped and Harv asked for their recommendations. It turned out that they all had pretty much agreed that the six tests he had on the outline were plenty. As for the rest of the course, the general consensus was that he probably knew the most about what they should learn, since no one had worked much on cars before. After class he reflected, "So much for group decision making: they all decide not to decide. Well, if that's what they want, I guess that's it."

By the fourth week of the course, Harv wasn't sure where to go next. Everyone was still showing up, and they seemed to like the class. But they were all working at different speeds. As a result, it was hard to decide when to start the next project on engine timing, much less when to give the test on checking for worn-out belts.

He'd thought after that first night that it would be pretty simple since everyone had agreed on the same thing. But it wasn't; they all turned out to be so different.

For instance, there was Lonnie, who didn't even know how to check his tire pressure when he started the course (and he thought he was going to save hundreds of dollars in car repair bills by the time he finished the course!). He'd done all right on the tests, but Harv wasn't sure he'd ever get air in his tires without blowing them up. Still, he sure seemed to be happy fooling around the engine, and he knew how to check his oil level now. So, maybe it wasn't a total loss.

But then there were people like Janet and Steve, who were making rapid progress through the hands-on activities. They evidently had done some work on cars before. They kept talking about exhaust systems and rebuilding carburetors. He didn't want to hold them back just because other people weren't as far along.

On the other hand, Steve hadn't passed any of the tests so far. For that matter, he hadn't even gone so far as to write his name on the test. Maybe he just was too embarrassed to put his name on a test he hadn't studied for. At first, Harv thought maybe Steve had been sick and hadn't had a chance to study, so he asked him if he wanted to take the tests again later. But Steve said no, he'd just live with it. Well, okay. He didn't want to push anybody.


Compare your written critique of the instructor's performance with the model critique given below. Your response need not exactly duplicate the model response; however, you should have covered the same major points.

MODEL CRITIQUE

Harv Dunfry's approach to teaching and evaluation was based on the very best of intentions. Drawing on some good general advice given at the instructor's orientation session and his memory of good teachers he'd had in high school, Harv set out to teach a class in which adult learners had choices about what they would learn and how they would be evaluated. Clearly, he had a lot to offer the learners who returned to the course week after week.

He should be commended for relying on his knowledge and experience when he provided structure for the course by identifying the performance objectives and evaluation devices essential for an introductory course.

Harv's choice in using a collaborative approach was a good one, and he made a very sensible move when he explained to learners all of the steps involved in the group decision-making process. However, Harv's silence on the subject of grades left a number of important questions unanswered. Were the planned tests to be graded? What other learning activities would be graded, and how? What assessment results would contribute to a final grade? Would grades be submitted to the institution? Did the school require that learners be graded? Harv's students needed to have this information in order to make more informed decisions about their activities and appropriate evaluation strategies.

In addition, even though he was acting out of respect for the learners' right to make independent decisions, Harv's decision to leave the classroom during the committee sessions was unwise. He needed to be present in order to provide information and guidance during the process. As it was, beginning learners simply didn't have enough information to work with when making decisions. Hence, their choice was "not to decide."


Hary was thrown off track by the class's generalagreement to do what he thought best. It was possi-ble, sincr) this was an introductory course, that heknew the most about specific learning options.However, he could have presented learners with avariety of learning experiences and evaluation ac-tivities ,o choose from, rather than simply tellingthem to decide for themselves.

It is also possible that, given learners' range ofskills and experience, Hary and the class would havebeen better off using individual learning contracts.

Harv's concern about holding back some studentsbecause others were slower could have been elimi-nated if he had conducted some preassessment ac-tivities to obtain basic information about learners'skill and knowledge levels. It turned out that hisstudents had a wide range of skill levels, which couldhave been accommodated earlier if he had knownabout them. As it was, Hary relied on observationand informal discussion to obtain this information.These are indeed good tools for assessment if usedsystematically, but they need to be supplementedwith other strategies Oat can provide specific infor-mation at an early point in the course.

Perhaps Harv would have discovered, through preassessment, that Steven had some advanced skills in auto mechanics but lacked other skills, such as the ability to read or write well. With this knowledge, he could have done two important things: (1) work with Steven to identify more advanced projects to work on independently and (2) evaluate Steven's knowledge through oral, rather than written, tests.

Harv was correct in explaining why he had developed the tests that he planned to use, but he should have also explained how the test results could help the learners, as well as how they affected overall grades.


It also seems that, even though the use of some written tests was appropriate for assessing learners' knowledge, Harv needed to go further in his plans and provide for evaluating learners' hands-on skills through observation. He needed to develop some performance tests to accurately evaluate learners' skills. Learners could also have used these tests for practice and self-assessment. Maybe, in that way, Lonnie could even learn to fill his tires without an explosion.

In addition, a group consensus about taking the tests that Harv had planned did not mean that everyone in the class had to take the same test at the same time. There was no apparent need for this. And if he had arranged for learners to take tests when they were ready, there would have been less confusion about when to administer the next test and when to move on to the next learning experience.

For the most part, Harv Dunfry's efforts in working with adults to develop evaluation strategies were pretty well founded, but they needed to be further thought out and developed.

Level of Performance: Your written critique of the case study should have covered the same major points as the model critique. If you missed some points or have questions about any additional points you made, review the material in the information sheet, Evaluating Adults: What's Appropriate?, pp. 50-57, or check with your resource person if necessary.


Learning Experience IV
FINAL EXPERIENCE

For a definition of "actual teaching situation," see the inside back cover.



TEACHER PERFORMANCE ASSESSMENT FORM
Evaluate the Performance of Adults (N-6)


Directions: Indicate the level of the teacher's accomplishment by placing an X in the appropriate box under the LEVEL OF PERFORMANCE heading. If, because of special circumstances, a performance component was not applicable, or impossible to execute, place an X in the N/A box.

Name

Date

Resource Person

LEVEL OF PERFORMANCE (each item is rated N/A, None, Poor, Fair, Good, or Excellent)

In planning for evaluation, the instructor:
1. identified and accommodated the following:
   a. institutional requirements
   b. occupational requirements
   c. individual needs, objectives, and skill and knowledge levels
   d. learners' special needs

In using collaborative approaches to developing evaluation strategies, the instructor:
2. worked with learners, to the extent possible, to identify:
   a. learning objectives
   b. learning experiences
   c. evidence of accomplishment
   d. evaluation criteria
   e. evaluation strategies
   f. timing of evaluation

If the group decision-making process was used, the instructor:
3. introduced the process and the procedures to be followed
4. accepted and listed all learner recommendations for discussion
5. guided and contributed to the discussion of the recommendations as needed
6. ensured that consensus was reached


If learning contracts were used, the instructor:
7. provided learners with adequate structure and directions for developing learning contracts
8. worked with individual learners to design appropriate contracts
9. provided a copy of the signed contract for both the learner and the instructor
10. reviewed contracts with learners on a regular basis and made any necessary adjustments

If grading contracts were used, the instructor:
11. identified the relative weight ranges for evaluation activities or the amount of work to be performed for each grade
12. provided copies of the signed contract for the instructor and the learner

In selecting the appropriate evaluators, the instructor:
13. ensured that a variety of evaluators were involved through such strategies as:
    a. self-evaluation
    b. peer evaluation
    c. instructor-based evaluation
    d. expert worker/supervisor evaluation

In developing each written test, the instructor:
14. identified appropriate types of test items based on the performance objectives
15. developed each test item according to the guidelines for constructing that type of item
16. developed a test that was valid, reliable, and usable
17. provided clear, readable copies for everyone
18. provided full, clear, and simple directions for the whole test and for each type of test item
19. grouped the same kinds of test items together
20. included no more than three different types of test items
21. developed clear and unambiguous test items, with the guessing factor minimized
22. developed an appropriate scoring key


In developing each performance test, the instructor:
23. stated the performance objective fully and clearly
24. provided clear directions concerning the testing situation
25. laid out the checklist/rating scale so that it is clear and easy to use
26. included all essential criteria and only essential criteria
27. stated all items clearly
28. arranged all items in a logical sequence
29. ensured that the test was:
    a. appropriate for the type of performance involved
    b. reasonable in length

In administering each test, the instructor:
30. provided a comfortable testing environment
31. explained the purpose of the test prior to administering it
32. allowed sufficient time for the test to be completed by all or most learners

After administering each test, the instructor:
33. reviewed the results and identified the following:
    a. learners' skill and knowledge areas that need to be strengthened
    b. any necessary changes in instructional plans

In determining learners' grades, the instructor:
34. informed learners of the basis and system for determining their grades
35. maintained complete and accurate records of learner performance
36. assigned grades that were consistent with the institution's grading system

Level of Performance: All items must receive N/A, GOOD, or EXCELLENT responses. If any item receives a NONE, POOR, or FAIR response, the instructor and resource person should meet to determine what additional activities the instructor needs to complete in order to reach competency in the weak area(s).



ABOUT USING THE NATIONAL CENTER'S PBTE MODULES


Organization

Each module is designed to help you gain competency in a particular skill area considered important to teaching success. A module is made up of a series of learning experiences, some providing background information, some providing practice experiences, and others combining these two functions. Completing these experiences should enable you to achieve the terminal objective in the final learning experience. The final experience in each module always requires you to demonstrate the skill in an actual teaching situation when you are an intern, a student teacher, an inservice teacher, or occupational trainer.

Procedures

Modules are designed to allow you to individualize your teacher education program. You need to take only those modules covering skills that you do not already possess. Similarly, you need not complete any learning experience within a module if you already have the skill needed to complete it. Therefore, before taking any module, you should carefully review (1) the introduction, (2) the objectives listed on p. 4, (3) the overviews preceding each learning experience, and (4) the final experience. After comparing your present needs and competencies with the information you have read in these sections, you should be ready to make one of the following decisions:

That you do not have the competencies indicated and should complete the entire module
That you are competent in one or more of the enabling objectives leading to the final learning experience and, thus, can omit those learning experiences
That you are already competent in this area and are ready to complete the final learning experience in order to "test out"
That the module is inappropriate to your needs at this time

When you are ready to complete the final learning experience and have access to an actual teaching situation, make the necessary arrangements with your resource person. If you do not complete the final experience successfully, meet with your resource person and arrange to (1) repeat the experience or (2) complete (or review) previous sections of the module or other related activities suggested by your resource person before attempting to repeat the final experience.

Options for recycling are also available in each of the learning experiences preceding the final experience. Any time you do not meet the minimum level of performance required to meet an objective, you and your resource person may meet to select activities to help you reach competency. This could involve (1) completing parts of the module previously skipped, (2) repeating activities, (3) reading supplementary resources or completing additional activities suggested by the resource person, (4) designing your own learning experience, or (5) completing some other activity suggested by you or your resource person.

Terminology

Actual Teaching Situation: A situation in which you are actually working with and responsible for teaching secondary or postsecondary vocational students or other occupational trainees. An intern, a student teacher, an inservice teacher, or other occupational trainer would be functioning in an actual teaching situation. If you do not have access to an actual teaching situation when you are taking the module, you can complete the module up to the final learning experience. You would then complete the final learning experience later (i.e., when you have access to an actual teaching situation).

Alternate Activity or Feedback: An item that may substitute for required items that, due to special circumstances, you are unable to complete.

Occupational Specialty: A specific area of preparation within a vocational service area (e.g., the service area Trade and Industrial Education includes occupational specialties such as automobile mechanics, welding, and electricity).

Optional Activity or Feedback: An item that is not required but that is designed to supplement and enrich the required items in a learning experience.

Resource Person: The person in charge of your educational program (e.g., the professor, instructor, administrator, instructional supervisor, cooperating/supervising/classroom teacher, or training supervisor who is guiding you in completing this module).

Student: The person who is receiving occupational instruction in a secondary, postsecondary, or other training program.

Vocational Service Area: A major vocational field: agricultural education, business and office education, marketing and distributive education, health occupations education, home economics education, industrial arts education, technical education, or trade and industrial education.

You or the Teacher/Instructor: The person who is completing the module.

Levels of Performance for Final Assessment

N/A: The criterion was not met because it was not applicable to the situation.
None: No attempt was made to meet the criterion, although it was relevant.
Poor: The teacher is unable to perform this skill or has only very limited ability to perform it.
Fair: The teacher is unable to perform this skill in an acceptable manner but has some ability to perform it.
Good: The teacher is able to perform this skill in an effective manner.
Excellent: The teacher is able to perform this skill in a very effective manner.


Titles of the National Center's Performance-Based Teacher Education Modules

Category A: Program Planning, Development, and Evaluation
A-1 Prepare for a Community Survey
A-2 Conduct a Community Survey
A-3 Report the Findings of a Community Survey
A-4 Organize an Occupational Advisory Committee
A-5 Maintain an Occupational Advisory Committee
A-6 Develop Program Goals and Objectives
A-7 Conduct an Occupational Analysis
A-8 Develop a Course of Study
A-9 Develop Long-Range Program Plans
A-10 Conduct a Student Follow-Up Study
A-11 Evaluate Your Vocational Program

Category B: Instructional Planning
B-1 Determine Needs and Interests of Students
B-2 Develop Student Performance Objectives
B-3 Develop a Unit of Instruction
B-4 Develop a Lesson Plan
B-5 Select Student Instructional Materials
B-6 Prepare Teacher-Made Instructional Materials

Category C: Instructional Execution
C-1 Direct Field Trips
C-2 Conduct Group Discussions, Panel Discussions, and Symposiums
C-3 Employ Brainstorming, Buzz Group, and Question Box Techniques
C-4 Direct Students in Instructing Other Students
C-5 Employ Simulation Techniques
C-6 Guide Student Study
C-7 Direct Student Laboratory Experience
C-8 Direct Students in Applying Problem-Solving Techniques
C-9 Employ the Project Method
C-10 Introduce a Lesson
C-11 Summarize a Lesson
C-12 Employ Oral Questioning Techniques
C-13 Employ Reinforcement Techniques
C-14 Provide Instruction for Slower and More Capable Learners
C-15 Present an Illustrated Talk
C-16 Demonstrate a Manipulative Skill
C-17 Demonstrate a Concept or Principle
C-18 Individualize Instruction
C-19 Employ the Team Teaching Approach
C-20 Use Subject Matter Experts to Present Information
C-21 Prepare Bulletin Boards and Exhibits
C-22 Present Information with Models, Real Objects, and Flannel Boards
C-23 Present Information with Overhead and Opaque Materials
C-24 Present Information with Filmstrips and Slides
C-25 Present Information with Films
C-26 Present Information with Audio Recordings
C-27 Present Information with Televised and Videotaped Materials
C-28 Employ Programmed Instruction
C-29 Present Information with the Chalkboard and Flip Chart

Category D: Instructional Evaluation
D-1 Establish Student Performance Criteria
D-2 Assess Student Performance: Knowledge
D-3 Assess Student Performance: Attitudes
D-4 Assess Student Performance: Skills
D-5 Determine Student Grades
D-6 Evaluate Your Instructional Effectiveness

Category E: Instructional Management
E-1 Project Instructional Resource Needs
E-2 Manage Your Budgeting and Reporting Responsibilities
E-3 Arrange for Improvement of Your Vocational Facilities
E-4 Maintain a Filing System
E-5 Provide for Student Safety
E-6 Provide for the First Aid Needs of Students
E-7 Assist Students in Developing Self-Discipline
E-8 Organize the Vocational Laboratory
E-9 Manage the Vocational Laboratory
E-10 Combat Problems of Student Chemical Use

Category G: School-Community Relations
G-1 Develop a School-Community Relations Plan for Your Vocational Program
G-2 Give Presentations to Promote Your Vocational Program
G-3 Develop Brochures to Promote Your Vocational Program
G-4 Prepare Displays to Promote Your Vocational Program
G-5 Prepare News Releases and Articles Concerning Your Vocational Program
G-6 Arrange for Television and Radio Presentations Concerning Your Vocational Program
G-7 Conduct an Open House
G-8 Work with Members of the Community
G-9 Work with State and Local Educators
G-10 Obtain Feedback about Your Vocational Program

Category H: Vocational Student Organization
H-1 Develop a Personal Philosophy Concerning Vocational Student Organizations
H-2 Establish a Vocational Student Organization
H-3 Prepare Vocational Student Organization Members for Leadership Roles
H-4 Assist Vocational Student Organization Members in Developing and Financing a Yearly Program of Activities
H-5 Supervise Activities of the Vocational Student Organization
H-6 Guide Participation in Vocational Student Organization Contests

Category I: Professional Role and Development
I-1 Keep Up-to-Date Professionally
I-2 Serve Your Teaching Profession
I-3 Develop an Active Personal Philosophy of Education
I-4 Serve the School and Community
I-5 Obtain a Suitable Teaching Position
I-6 Provide Laboratory Experiences for Prospective Teachers
I-7 Plan the Student Teaching Experience
I-8 Supervise Student Teachers

Category J: Coordination of Cooperative Education
J-1 Establish Guidelines for Your Cooperative Vocational Program
J-2 Manage the Attendance, Transfers, and Terminations of Co-op Students
J-3 Enroll Students in Your Co-op Program
J-4 Secure Training Stations for Your Co-op Program
J-5 Place Co-op Students on the Job
J-6 Develop the Training Ability of On-the-Job Instructors
J-7 Coordinate On-the-Job Instruction
J-8 Evaluate Co-op Students' On-the-Job Performance
J-9 Prepare for Students' Related Instruction
J-10 Supervise an Employer-Employee Appreciation Event

Category K: Implementing Competency-Based Education (CBE)
K-1 Prepare Yourself for CBE
K-2 Organize the Content for a CBE Program
K-3 Organize Your Class and Lab to Install CBE
K-4 Provide Instructional Materials for CBE
K-5 Manage the Daily Routines of Your CBE Program
K-6 Guide Your Students Through the CBE Program

Category L: Serving Students with Special/Exceptional Needs
L-1 Prepare Yourself to Serve Exceptional Students
L-2 Identify and Diagnose Exceptional Students
L-3 Plan Instruction for Exceptional Students
L-4 Provide Appropriate Instructional Materials for Exceptional Students
L-5 Modify the Learning Environment for Exceptional Students
L-6 Promote Peer Acceptance of Exceptional Students
L-7 Use Instructional Techniques to Meet the Needs of Exceptional Students
L-8 Improve Your Communication Skills
L-9 Assess the Progress of Exceptional Students
L-10 Counsel Exceptional Students with Personal-Social Problems
L-11 Assist Exceptional Students in Developing Career Planning Skills
L-12 Prepare Exceptional Students for Employability
L-13 Promote Your Vocational Program with Exceptional Students

Category M: Assisting Students in Improving Their Basic Skills
M-1 Assist Students in Achieving Basic Reading Skills
M-2 Assist Students in Developing Technical Reading Skills
M-3 Assist Students in Improving Their Writing Skills
M-4 Assist Students in Improving Their Oral Communication Skills
M-5 Assist Students in Improving Their Math Skills
M-6 Assist Students in Improving Their Survival Skills

Category F: Guidance
F-1 Gather Student Data Using Formal Data-Collection Techniques
F-2 Gather Student Data Through Personal Contacts
F-3 Use Conferences to Help Meet Student Needs
F-4 Provide Information on Educational and Career Opportunities
F-5 Assist Students in Applying for Employment or Further Education

Category N: Teaching Adults
N-1 Prepare to Work with Adult Learners
N-2 Market an Adult Education Program
N-3 Determine Individual Training Needs
N-4 Plan Instruction for Adults
N-5 Manage the Adult Instructional Process
N-6 Evaluate the Performance of Adults

RELATED PUBLICATIONS
Student Guide to Using Performance-Based Teacher Education Materials
Resource Person Guide to Using Performance-Based Teacher Education Materials
Guide to the Implementation of Performance-Based Teacher Education
Performance-Based Teacher Education: The State of the Art, General Education and Vocational Education

For information regarding availability and prices of these materials, contact AAVIM, American Association for Vocational Instructional Materials, 120 Driftmier Engineering Center, University of Georgia, Athens, Georgia 30602, (404) 542-2586.

ISBN 0-89606-237-6