DOCUMENT RESUME

ED 258 008    CE 041 674

AUTHOR: Pfister, Linda A.
TITLE: Communicate and Use Evaluation-Based Decisions. Module CG E-2 of Category E--Evaluating. Competency-Based Career Guidance Modules.
INSTITUTION: American Association for Counseling and Development, Alexandria, VA.; American Institutes for Research in the Behavioral Sciences, Palo Alto, Calif.; American Vocational Association, Inc., Arlington, Va.; Missouri Univ., Columbia.; Ohio State Univ., Columbus. National Center for Research in Vocational Education.
SPONS AGENCY: Office of Vocational and Adult Education (ED), Washington, DC.
REPORT NO: ISBN-0-934425-37-X
PUB DATE: 85
NOTE: 40p.; For other modules in the Competency-Based Career Guidance Series, see CE 041 641.
AVAILABLE FROM: Bell and Howell Publication Systems Division, Old Mansfield Road, Wooster, OH 44691-9050.
PUB TYPE: Guides - Classroom Use - Materials (For Learner) (051)
EDRS PRICE: MF01/PC02 Plus Postage.
DESCRIPTORS: Career Counseling; *Career Guidance; *Change Strategies; Check Lists; Competency Based Education; Counselor Training; *Decision Making; *Evaluation Utilization; Guidance Objectives; Guidance Personnel; *Guidance Programs; Information Dissemination; Inservice Education; Learning Activities; Learning Modules; Lesson Plans; Paraprofessional Personnel; Postsecondary Education; *Program Evaluation; Questionnaires; Research Utilization; Secondary Education; Self Evaluation (Individuals)
ABSTRACT: This learning module, one in a series of competency-based guidance program training packages focusing upon professional and paraprofessional competencies of guidance personnel, deals with reporting results of program evaluation and using results to make key administrative decisions. Addressed in the module are the following topics: evaluating results, communicating results effectively, deciding how to use evaluation results, and identifying alternatives for change. The module consists of readings and learning experiences covering these four topics. Each learning experience contains some or all of the following: an overview, a competency statement, a learning objective, one or more individual learning activities, an individual feedback exercise, one or more group activities, and a facilitator's outline for use in directing the group activities. Concluding the module are a participant self-assessment questionnaire, a trainer's assessment questionnaire, a checklist of performance indicators, a list of references, and an annotated list of suggested additional resources. (MN)
Communicate and Use Evaluation-Based Decisions
U.S. DEPARTMENT OF EDUCATION
NATIONAL INSTITUTE OF EDUCATION
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it. Minor changes have been made to improve reproduction quality. Points of view or opinions stated in this document do not necessarily represent official NIE position or policy.
BELL & HOWELL
Publication Systems Division
Publication Products
"PERMISSION TO REPRODUCE THIS MATERIAL HAS BEEN GRANTED BY

TO THE EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)."
PLANNING
A-1 Identify and Plan for Guidance Program Change
A-2 Organize Guidance Program Development Team
A-3 Collaborate with the Community
A-4 Establish a Career Development Theory
A-5 Build a Guidance Program Planning Model
A-6 Determine Client and Environmental Needs

SUPPORTING
B-1 Influence Legislation
B-2 Write Proposals
B-3 Improve Public Relations and Community Involvement
B-4 Conduct Staff Development Activities
B-5 Use and Comply with Administrative Mechanisms

IMPLEMENTING
C-1 Counsel Individuals and Groups
C-2 Tutor Clients
C-3 Conduct Computerized Guidance
C-4 Infuse Curriculum-Based Guidance
C-5 Coordinate Career Resource Centers
C-6 Promote Home-Based Guidance
C-7 Develop a Work Experience Program
C-8 Provide for Employability Skill Development
C-9 Provide for the Basic Skills
C-10 Conduct Placement and Referral Activities
C-11 Facilitate Follow-through and Follow-up
C-12 Create and Use an Individual Career Development Plan
C-13 Provide Career Guidance to Girls and Women
C-14 Enhance Understanding of Individuals with Disabilities
C-15 Help Ethnic Minorities with Career Guidance
C-16 Meet Initial Guidance Needs of Older Adults
C-17 Promote Equity and Client Advocacy
C-18 Assist Clients with Equity Rights and Responsibilities
C-19 Develop Ethical and Legal Standards

OPERATING
D-1 Ensure Program Operations
D-2 Aid Professional Growth

EVALUATING
E-1 Evaluate Guidance Activities
E-2 Communicate and Use Evaluation-Based Decisions
Communicate and Use Evaluation-Based Decisions
Module CG E-2 of Category E--Evaluating
Competency-Based Career Guidance Modules
by Linda A. Pfister
HBJ Media Systems, Orlando, FL
The National Center for Research in Vocational Education
The Ohio State University
1960 Kenny Road
Columbus, Ohio 43210
1985
ISBN 0-934425-37-X
Copyright 1985 by The National Center for Research in Vocational Education, The Ohio State University. All rights reserved.
These materials were developed by the National Center for Research in Vocational Education, The Ohio State University, Columbus, Ohio; The American Association for Counseling and Development, Alexandria, Virginia; The American Vocational Association, Arlington, Virginia; The American Institutes for Research, Palo Alto, California; and the University of Missouri-Columbia, through contracts from the United States Department of Education, Office of Vocational and Adult Education, under the research section of the Education Amendments of 1976 (P.L. 94-482). Copyright is claimed until full term. Thereafter all portions of this work covered by this copyright will be in the public domain. The opinions expressed, however, do not necessarily reflect the position or policy of the Department of Education, and no official endorsement by the Department of Education should be inferred.
Published and distributed by Bell & Howell Publication Systems Division, Old Mansfield Road, Wooster, Ohio 44691-9050. 1-800-321-9881 or in Ohio call (216) 264-6666.
FOREWORD

This counseling and guidance program series is patterned after the Performance-Based Teacher Education modules designed and developed at the National Center for Research in Vocational Education under Federal Number NEC00-3-77. Because this model has been successfully and enthusiastically received nationally and internationally, this series of modules follows the same basic format.
This module is one of a series of competency-based guidance program training packages focusing upon specific professional and paraprofessional competencies of guidance personnel. The competencies upon which these modules are based were identified and verified through a project study as being those of critical importance for the planning, supporting, implementing, operating, and evaluating of guidance programs. These modules are addressed to professional and paraprofessional guidance program staff in a wide variety of educational and community settings and agencies.
Each module provides learning experiences that integrate theory and application; each culminates with competency-referenced evaluation suggestions. The materials are designed for use by individuals or groups of guidance personnel who are involved in training. Resource persons should be skilled in the guidance program competency being developed and should be thoroughly oriented to the concepts and procedures used in the total training package.
The design of the materials provides considerable flexibility for planning and conducting competency-based preservice and inservice programs to meet a wide variety of individual needs and interests. The materials are intended for use by universities, state departments of education, postsecondary institutions, intermediate educational service agencies, JTPA agencies, employment security agencies, and other community agencies that are responsible for the employment and professional development of guidance personnel.
The competency-based guidance program training packages are products of a research effort by the National Center's Career Development Program Area. Many individuals, institutions, and agencies participated with the National Center and have made contributions to the systematic development, testing, and refinement of the materials.
National consultants provided substantial writing and review assistance in development of the initial module versions; over 1,300 guidance personnel used the materials in early stages of their development and provided feedback to the National Center for revision and refinement. The materials have been or are being used by 57 pilot community implementation sites across the country.
Special recognition for major roles in the direction, development, coordination of development, testing, and revision of these materials and the coordination of pilot implementation sites is extended to the following project staff: Harry N. Drier, Consortium Director; Robert E. Campbell, Linda Pfister, Directors; Robert Shuman, Research Specialist; Karen Kimmel Boyle, Fred Williams, Program Associates; and Janie B. Connell, Graduate Research Associate.
Appreciation also is extended to the subcontractors who assisted the National Center in this effort. Drs. Brian Jones and Linda Phillips-Jones of the American Institutes for Research developed the competency base for the total package, managed project evaluation, and developed the modules addressing special needs. Gratitude is expressed to Dr. Norman Gysbers of the University of Missouri-Columbia for his work on the module on individual career development plans. Both of these agencies provided coordination and monitoring assistance for the pilot implementation sites. Appreciation is extended to the American Vocational Association and the American Association for Counseling and Development for their leadership in directing extremely important subcontracts associated with the final phase of this effort.
The National Center is grateful to the U.S. Department of Education, Office of Vocational and Adult Education (OVAE), for sponsorship of three contracts related to this competency-based guidance program training package. In particular, we appreciate the leadership and support offered project staff by David H. Pritchard, who served as the project officer for the contracts. We feel the investment of the OVAE in this training package is sound and will have lasting effects in the field of guidance in the years to come.
Robert E. Taylor
Executive Director
National Center for Research in Vocational Education
THE NATIONAL CENTER FOR RESEARCH IN VOCATIONAL EDUCATION
THE OHIO STATE UNIVERSITY
1960 KENNY ROAD, COLUMBUS, OHIO 43210
The National Center for Research in Vocational Education's mission is to increase the ability of diverse agencies, institutions, and organizations to solve educational problems relating to individual career planning, preparation, and progression. The National Center fulfills its mission by:
- Generating knowledge through research
- Developing educational programs and products
- Evaluating individual program needs and outcomes
- Providing information for national planning and policy
- Installing educational programs and products
- Operating information systems and services
- Conducting leadership development and training programs
BELL & HOWELL
Publication Systems Division
Publication Products

Bell & Howell, Publication Products, is one of two operating units that comprise Publication Systems Division. Based in Wooster, Ohio, Publication Products specializes in the production and reproduction of newspapers, periodicals, indexes, career information materials, and other widely used information sources in microform, hard copy, and electronic media.
ABOUT THIS MODULE
COMMUNICATE AND USE EVALUATION-BASED DECISIONS

Goal: After completing this module, career guidance personnel will be better able to report and disseminate the results of program evaluation activities to appropriate audiences and to use evaluation results to make key administrative decisions about the program and its future direction.
INTRODUCTION 5

READING 7

Competency 1. Clarify the decisions to which evaluation data may contribute and the characteristics and needs of various audiences interested in the evaluation results 7

Competency 2. Choose a content, format, and level of sophistication for the evaluation report or presentation that is appropriate to the characteristics of the target audiences and their decision needs 8

Competency 3. Develop rules to be followed in making program decisions, and clarify decisions to which formative and summative evaluation data may contribute 9

Competency 4. State the alternatives to be considered in making program decisions, use the decision-making rules and evaluation results to consider the various alternatives, and select the most beneficial alternatives for the program 10

LEARNING EXPERIENCES

1. Evaluating Results: Who Needs Them and Why 11
2. Communicating Results Effectively 15
3. Deciding How to Use Evaluation Results 19
4. Identifying Alternatives for Change 23

EVALUATION 27

REFERENCES 35
ABOUT USING THE CBCG MODULES

CBCG Module Organization
The training modules cover the knowledge, skills, and attitudes needed to plan, support, implement, operate, and evaluate a comprehensive career guidance program. They are designed to provide career guidance program implementers with a systematic means to improve their career guidance programs. They are competency-based and contain specific information that is intended to assist users to develop at least part of the critical competencies necessary for overall program improvement.
These modules provide information and learning ac-tivities that are useful for both school-based andnonschool-based career guidance programs.
The modules are divided into five categories.

The GUIDANCE PROGRAM PLANNING category assists guidance personnel in outlining in advance what is to be done.

The SUPPORTING category assists personnel in knowing how to provide resources or means that make it possible for planned program activities to occur.

The IMPLEMENTING category suggests how to conduct, accomplish, or carry out selected career guidance program activities.

The OPERATING category provides information on how to continue the program on a day-to-day basis once it has been initiated.

The EVALUATING category assists guidance personnel in judging the quality and impact of the program and either making appropriate modifications based on findings or making decisions to terminate it.
Module Format
A standard format is used in all of the program'scompetency-based modules. Each module contains (1) anintroduction, (2) a module focus, (3) a reading, (4) learn-ing experiences, (5) evaluation techniques, and (6)resources.
Introduction. The introduction gives you, the module user, an overview of the purpose and content of the module. It provides enough information for you to determine if the module addresses an area in which you need more competence.

About This Module. This section presents the following information:

Module Goal: A statement of what one can accomplish by completing the module.

Competencies: A listing of the competency statements that relate to the module's area of concern. These statements represent the competencies thought to be most critical in terms of difficulty for inexperienced implementers, and they are not an exhaustive list.

This section also serves as the table of contents for the reading and learning experiences.

Reading. Each module contains a section in which cognitive information on each one of the competencies is presented. This reading can be used in either of two ways:
1. Use it as a textbook by starting at the first page and reading through until the end. You could then complete the learning experiences that relate to specific competencies. This approach is good if you would like to give an overview of some competencies and a more in-depth study of others.

2. Turn directly to the learning experience(s) that relate to the needed competency (competencies). Within each learning experience a reading is listed. This approach allows for a more experiential approach prior to the reading activity.
Learning Experiences. The learning experiences are designed to help users in the achievement of specific learning objectives. One learning experience exists for each competency (or a cluster of like competencies), and each learning experience is designed to stand on its own. Each learning experience is preceded by an overview sheet which describes what is to be covered in the learning experience. Within the body of the learning experience, the following components appear.

Individual Activity: This is an activity which a person can complete without any outside assistance. All of the information needed for its completion is contained in the module.

Individual Feedback: After each individual activity there is a feedback section. This is to provide users with immediate feedback or evaluation regarding their progress before continuing. The concept of feedback is also intended with the group activities, but it is built right into the activity and does not appear as a separate section.
Group Activity: This activity is designed to befacilitated by a trainer, within a group training session.
The group activity is formatted along the lines of a facilitator's outline. The outline details suggested activities and information for you to use. A blend of presentation and "hands-on" participant activities such as games and role playing is included. A Notes column appears on each page of the facilitator's outline. This space is provided so trainers can add their own comments and suggestions to the cues that are provided.
Following the outline is a list of materials that will be needed by the workshop facilitator. This section can serve as a duplication master for mimeographed handouts or transparencies you may want to prepare.

Evaluation Techniques. This section of each module contains information and instruments that can be used to measure what workshop participants need prior to training and what they have accomplished as a result of training. Included in this section are a Pre- and Post-Participant Assessment Questionnaire and a Trainer's Assessment Questionnaire. The latter contains a set of performance indicators which are designed to determine the degree of success the participants had with the activity.

References. All major sources that were used to develop the module are listed in this section. Also, major materials resources that relate to the competencies presented in the module are described and characterized.
INTRODUCTION

The basic purpose of evaluation is to provide information for planning and future action or decisions. A program evaluation seeks to determine whether or not the program activities have met their objectives. Practically, this means: "Did your clients learn what you intended them to learn?" Realistically, these results often have substantial impact on both the direction and life of your program. And in this age of accountability, many audiences are interested in the findings. These audiences range from your clients and parents or guardians (where appropriate) to taxpayers and legislators--particularly if your program is being supported with state or federal funds.

Evaluation is a key component in any career guidance program, not an add-on. Although this module and the module CG E-1, Evaluate Guidance Activities, appear to complete the series, they are topics and competencies that must be undertaken as you plan your program. The careful setting of objectives, making sure that they are both important and measurable, will ensure that you obtain information to refine and improve your efforts.

Communicating clearly is important in all aspects of program development and implementation. It is particularly essential to keep people up to date with the progress your program is making in reaching its goals. This module focuses on how to communicate this progress as well as how to use evaluation findings to improve your program. It ties in not only with the module CG E-1, Evaluate Guidance Activities, but also with the first module of the series, CG A-1, Identify and Plan for Guidance Program Change, thus illustrating the cyclical nature of program development and implementation.
READING
Evaluating Results: Who Needs Them and Why
There are many people who believe that evaluation reports are never read--only filed. Unfortunately, it does seem that all too often an evaluation is conducted to meet the demands of a funding agency, and critics justifiably claim that the report is simply a product to meet contract requirements. However, with increasing demands on limited funds and heightened citizen concern for use of tax monies, more and more people are interested in whether or not the program that you provide has a positive impact on your clients.

As you plan to communicate evaluation results to your constituents, you need to identify the various audiences and determine what information they need for decision making. If you are receiving external funding for your program, these sponsors are, of course, a key audience. But who else needs information? Whatever the structure of your agency is, certain administrators will need evaluation information. If you have an advisory committee (several members of whom might serve as an evaluation team), you will want to keep them advised of both progress and problems. As both staff members and clients will undoubtedly be involved in any modification of the program--as well as possibly providing the basic evaluation data--they should also be viewed as key audiences. One often overlooked group is the community. The public is more interested than we often credit them for in the results of our programs. It is often their tax dollars and their education or that of their children that are affected. In addition, many career guidance programs rely on the participation of the community to be truly effective. Depending upon the impact of the evaluation, there may be additional audiences, such as legislative leaders or board of education members, that need to be advised of the findings. Also, be sensitive to any "right-to-know" audiences--that is, anyone who would be affected by the results.
Evaluation data are generally used to make one orboth of the following decisions:
1. How can this program be improved?
2. Should this program be continued?
The first question is usually the focus of a formative evaluation: that is, a study of the process of program implementation. Such data are of particular use to program administrators as they work to improve particular components of the program. The second question is, of course, more consequential to the life of the program. This decision can be made by data gathered through a summative evaluation study. Summative evaluations are results oriented and seek to find the effects that program efforts have on clients. A variety of audiences are interested in these findings because the resulting decision defines the worth of the program.
Communicating Results Effectively
The importance of being sensitive to the individuality of your audiences is easy to acknowledge but somewhat more complex if you intend to carry it out fully. There are many things to consider: how much visibility should this program evaluation receive; how many different audiences must the results reach; how much time and money should be devoted to disseminating the research findings before beginning to make program changes? In essence, communicating evaluation results is important; however, the extent of the dissemination process must be thought through carefully.

Your communication strategies will differ depending upon the type of evaluation results you have to report. Formative evaluation results, which address program components on an ongoing basis, often are directed toward fewer audiences than are the summative evaluation results and are usually presented more informally, providing suggestions that are useful primarily to program staff. Summative evaluations, however, often are of interest to a wider variety of audiences as they reflect end-of-program results. The reporting of these results can take a variety of forms, but should include the following elements:
1. Introduction. A statement of purpose and a brief description of the program should provide sufficient background for the reader to make use of the study's recommendations.

2. Recommendations. The reader should know what recommendations have been made prior to reading the details of the study. This is the most important section of the report. Recommendations should be clear and concise and directly related to the findings of the study. That is, if there are five major recommendations, there should also be five sets of findings to support those recommendations. From this, program staff and others can suggest methods to act on the recommendations.
3. Evaluation Design and Procedures. This section should describe the design, the instrumentation, data sources, and any limitations within the study. Although this part of the report will undoubtedly be quite technical, present your techniques as simply and concisely as possible. There are always several approaches that could be used in designing an evaluation study. Be thorough, but don't "overjustify" your design.

4. Findings. Use simple charts and tables to present your results. As you are reporting results--not making recommendations--refrain from explaining the results. Also, present your data in the most useful way possible. For example, use percentages such as "85 percent agree and 15 percent disagree" as opposed to using raw numbers.
Some evaluators believe that the same report should be provided to all audiences, with an executive summary listing key findings and recommendations for those who are either not interested in the full report or not sufficiently skilled to analyze it. This approach minimizes the expense of communication and avoids any misinterpretation that staff are providing different information to different audiences. Another approach--more widely approved--is to develop a different report, as well as a different style of presentation, for each audience. In adopting this approach, you should receive more attention from the audiences you wish to reach.

If you decide that your evaluation report should receive wide dissemination and that the implications of the findings and recommendations merit interaction with various audiences, you may wish to use means other than written reports to reach even more people. Use of the news media--newspapers, radio, and television--provides immediate visibility. This type of coverage can serve as a lead-in to a series of meetings to discuss the study in detail.
Deciding How to Use Evaluation Results
The true impact of an evaluation depends on whether or not the results are used to improve programs. A great deal of data are generated in most evaluations--some positive, some negative--and it may be your job to sort out the information and make sense of it. Although the data reveal to you how clients perform, you and others have to set the standards to interpret the results accurately. For example, if you use portions of instruments to measure how well your clients achieved a particular objective, what percentage of achievement will prove attainment--50 percent, 75 percent, 90 percent? In short, how well do your clients have to perform in order for you to be satisfied that they have met the program goals?

It is likely that as you examine the data, you will find that some objectives were clearly met, some were clearly not met, and some fall in between. To deal with these sorts of mixed findings, this set of guidelines can be used to analyze the results:
1. Examine all of the program objectives andseparate your findings into two areas: thosethat met the criteria for success and thosethat did not.
2. Analyze the program objectives that were met. Are there clues as to why they were achieved? Could activities be improved to either reach more clients or be more cost-effective?
3. Analyze the program objectives that werenot achieved. Are these objectives realistic?Should they be revised or discarded? Or,should other activities be undertaken tomeet the objectives?
4. Determine a revised set of objectives, per-haps including some objectives that werenot reached.
5. Plan activities which are likely to achieve the desired outcomes.
Using this framework, or a similar one, can help you in analyzing results from either formative or summative evaluations, as the process for making decisions does not change. Decisions resulting from formative evaluations are incorporated into the program as it is developing. If it is discovered that particular objectives are not being achieved, activities may be changed or new ones implemented. Decisions made from summative evaluations usually involve more extensive modifications, such as continuing or discontinuing certain program objectives.

As you begin planning for program modifications, be sure to look beyond the recommendations listed in the report. Evaluate the evaluation itself to make certain that the decisions you make for change have sufficient data to support the actions you take.
Identifying Alternatives for Change
It is rare to find an evaluation report that does not make recommendations for program improvement. This does not mean that most programs are of poor quality, nor does it reflect on the usefulness of evaluation, but rather on the vast number of objectives that are present in most programs--and the near impossibility of perfection!
When conclusions are offered as to program effectiveness, there is a variety of alternatives that are available to improve the effort. Even if a program meets all of its objectives, alternatives may need to be examined to see if there are other ways to meet the objectives and yet reach additional clients--perhaps without using additional funds.

On the other hand, if a program meets none of its objectives, there are alternatives to consider besides discarding the entire effort. Would the program have worked if there had been additional resources to support it? Were the objectives attainable or, with revisions, are they still important?

Often the objectives we establish for programs are a setup for failure because they are so idealistic that they are nearly impossible to attain. They should be reexamined before final decisions are made as to the future of the program.

What typically appears in evaluation findings is something less dramatic than total success or total failure. Regardless of the positive or negative finding, you should examine each objective and the activities undertaken to achieve it as outlined in the previous section. Analyze the data thoroughly before generating alternatives for change. In addition, it is important to analyze how ready your organization is to undertake change. If resistance is likely to be met regardless of the content of the change, work must begin there. You cannot implement change within your agency by yourself or with a small staff. It takes the commitment of the administration and staff, and this is more likely to occur by involvement rather than mandate. In essence, the same process that you undertook to initiate the program--planning, supporting, implementing, operating, and evaluating--must once again be undertaken as you make program changes.
Learning Experience 1
Evaluating Results: Who Needs Them and Why
OVERVIEW

COMPETENCY: Clarify the decisions to which evaluation data may contribute, and the characteristics and needs of various audiences interested in the evaluation results.

READING: Read Competency 1 on page 7.

INDIVIDUAL LEARNING OBJECTIVE: List the audiences to be served through evaluation efforts.

INDIVIDUAL ACTIVITY: Identify the various audiences to whom you plan to distribute your evaluation findings.

INDIVIDUAL FEEDBACK: Compare your ideas with notes provided.

GROUP LEARNING OBJECTIVE: Verbalize the potential impact that evaluation results have on various decision-making audiences.

GROUP ACTIVITY: Identify decisions that will be made as a result of a program evaluation.
INDIVIDUAL ACTIVITY
Identify the various audiences to whom you plan to distribute your evaluation findings.
Review the reading for Competency 1 on page 7, and then complete the following activity:
As director of your agency's career guidance program, you have decided that--after 3 years--it is essential that a comprehensive summative evaluation be performed. Your program's efforts have been guided by detailed objectives, and formative evaluations have been conducted periodically.
Your program has had high visibility both inside and outside your agency. List those audiences to whom you would communicate findings, and rank them in order of importance.
Audiences Rank
INDIVIDUAL FEEDBACK
Compare your ideas.
Did you list the following audiences?
Yes No
1. Funding Sponsor
2. Advisory Committee
3. Board of Education/Board of Directors
4. Staff (including administrators)
5. Clients
6. Parents or Guardians (if appropriate)
7. Community Leaders
8. Legislators
9. Professional Colleagues
In ranking for importance, there is no "right" answer. However, your listing will indicate your own values and may reflect the audience(s) to whom you are most responsive.
GROUP ACTIVITY
Identify decisions that will be made as a result of a program evaluation.
Note: The following outline is to be used by the workshop facilitator.
Facilitator's Outline Notes
A. Introduction
1. Indicate that this activity is designed to aid participants in identifying decisions made by various groups as a result of program evaluation data.
Facilitator's Outline Notes
2. Explain that this activity requires participants to role-play in small groups.
3. Have participants review the reading for Competency 1 on page 7.
B. Process
1. Ask participants to break into threegroups.
2. Assign each group a simulated identification label such as: (1) agency, (2) staff, or (3) community. Ask participants to assume the roles of representatives of these audiences.
3. Have participants assume that they have received an interim evaluation of a career guidance program indicating positive evidence of change in the types of learning activities provided for clients, but no evidence of significant change in the clients' knowledge or skills.
4. Have each group list ways in which their decision-making roles could impact the future of the program.
5. Reassemble as a total group and share findings, comparing and contrasting the decision-making powers of each group.
Encourage the groups to go beyond obvious actions, such as eliminating funding, to listing positive actions they could take.
Learning Experience 2
Communicating Results Effectively
OVERVIEW
COMPETENCY
Choose a content, format, and level of sophistication for the evaluation report or presentation that is appropriate to the characteristics of the target audiences and their decision needs.
READING
Read Competency 2 on page 8.
INDIVIDUAL LEARNING OBJECTIVE
Design a rating form to evaluate the content and format of your evaluation report.

INDIVIDUAL ACTIVITY
Prepare a rating form to evaluate the content and format of your evaluation effort.

INDIVIDUAL FEEDBACK
Check your form with the sample provided.

GROUP LEARNING OBJECTIVE
Design a plan for communicating your evaluation results.

GROUP ACTIVITY
Discuss various approaches for communicating your evaluation results.
INDIVIDUAL ACTIVITY
Prepare a rating form to evaluate the content and format of your evaluation report.
As you review the reading for Competency 2 on page 8, you will note that the basic elements of a report are listed:
1. Introduction
2. Recommendations
3. Evaluation Design and Procedures
4. Findings
Prepare a form that lists the criteria, components, etc., you would look for within each element of a good evaluation report. In other words, what are the review criteria you would use to measure the quality of a report?
INDIVIDUAL FEEDBACK
As you compare your form to the sample provided, note that by using the categories "well stated," "needs better statement," and "not stated," you can evaluate not only whether or not an element was included, but also how well it was presented.
Sample Evaluation Report Rating Form

(Rate each item: Well Stated / Needs Better Statement / Not Stated)

Introduction
1. Purpose of study
2. Program objectives
3. Program activities
4. Client description

Recommendations
1. Related to findings
2. Viability

Evaluation Design and Procedures
1. Framework explanation
2. Data collection procedures
3. Analysis process

Findings
1. Data presentation
2. Accomplishment of objectives
3. Outcome summary
Note: The following outline is to be used by the workshop facilitator.
Facilitator's Outline Notes
A. Introduction
1. Explain that the purpose of this activity is to discuss effective ways to communicate evaluation findings.
2. Discuss with participants items for consideration when disseminating findings, e.g., number of audiences, time and funds available, need for visibility. Use the reading for Competency 3 on page 9 as a basis for the discussion.
B. Process
1. Have participants list possible dissemination techniques (e.g., formal report, executive summary, separate reports, news releases, audiovisual presentations).
2. Ask participants to discuss the strengths and weaknesses of each strategy.
3. Have participants write brief outlines of their preferred communication design.
4. Ask participants to share the rationale for their decisions.
Learning Experience 3
Deciding How to Use Evaluation Results
OVERVIEW
COMPETENCY
Develop rules to be followed in making program decisions, and clarify decisions to which formative and summative evaluation data may contribute.
READING
Read Competency 3 on page 9.

INDIVIDUAL LEARNING OBJECTIVE
Determine an appropriate process to analyze program evaluation results.

INDIVIDUAL ACTIVITY
Design a method by which a group can examine evaluation results and make program decisions.

INDIVIDUAL FEEDBACK
Compare your ideas with the notes and sample chart provided.

GROUP LEARNING OBJECTIVE
Discuss the utility of formative and summative evaluation data in making program decisions.

GROUP ACTIVITY
Brainstorm questions that can and cannot be answered as a result of formative and summative evaluations.
INDIVIDUAL ACTIVITY
Design a method by which a group can examine evaluation results and make program decisions.
Before you start this activity, review the reading for Competency 3 on page 9. You have been assigned the responsibility of chairing a task force of agency and community representatives whose purpose is to make recommendations for improving your agency's career guidance program. A summative evaluation has recently been completed, and you and the other members of the task force have received the complete report. How would you plan to lead the group in analyzing the results? Remember--not all of your task force members are educators, nor are all educators evaluation experts.
INDIVIDUAL FEEDBACK
It is important that you make it as easy as possible for people to use the data listed in the report. Therecommendations should serve as support but should not limit the group's actions.
As was listed in the reading, look initially at the objectives that were accomplished and those that were not. From that point, determine what activities affected the achievement or lack thereof before having the group suggest changes. If the process is carried out in a step-by-step manner, confusion and overreaction should be kept to a minimum. A sample analysis chart follows.
Career Guidance Activities Evaluation Analysis Chart

Heading fields: Grade Level; Teacher/Counselor; Career Development Area; Age Level; Developmental Goal; Time; Cost; No. of Resource Persons; No. of Clients; Activity Dates; Career Domain/Development Needs; Comments

Rows (repeated for each objective): Objective 1 -- Measurement Date; Degree of Success; Activity 1; Activity 2; Activity 3
Note: The following outline is to be used by the workshop facilitator.
Facilitator's Outline Notes
A. Introduction
1. Explain that the purpose of this activity is to aid participants in determining the types of questions that can and cannot be answered through formative and summative evaluations.
2. Indicate that this activity will require participants to be involved in brainstorming in one of two groups.
3. Have participants review the reading for Competency 3 on page 9.
B. Process
1. Divide the participants into two groups--Formative and Summative.

2. Have each group brainstorm questions that can be answered using that particular type of evaluation.

3. After each group has a list of questions, have them list the types of data that would need to be gathered to answer each question.

4. Reconvene as a total group and share problems that occurred in the process, having members of the opposite group assist in determining ways to obtain data for the unanswered questions.

You may wish to note that some of the same questions may be raised by both groups.

Inform participants that they can eliminate questions for which they have no feasible way of gathering data.
Learning Experience 4
Identifying Alternatives for Change
OVERVIEW
COMPETENCY
State the alternatives to be considered in making program decisions, use the decision-making rules and evaluation results to consider the various alternatives, and select the most beneficial alternatives for the program.
READING Read Competency 4 on page 10.
INDIVIDUALLEARNINGOBJECTIVE
State alternatives to be considered in making decisions about program modifications.
. .
INDIVIDUAL ACTIVITY
List concerns and alternatives regarding program changes to be considered in expanding your program.
INDIVIDUALFEEDBACK
GROUPLEARNINGOBJECTIVE
GROUPACTIVITY
Compare your notes with the statements provided.
Provide a rationale supporting your recommendation to continue or discontinue a program.
Analyze results of program evaluations and make decisions as to whether or not the programs should be continued.
INDIVIDUAL ACTIVITY
List concerns and alternatives regarding program changes to be considered in expanding your program.
Review the reading for Competency 4 on page 10. You have just received the good news--the positive evaluation results of your "model" program have reached your chief executive, and you have been given the responsibility to expand your program--as is--throughout your agency within the next 6 months. What immediate concerns arise in your mind, and what alternatives do you visualize that might alleviate them?
INDIVIDUAL FEEDBACK

Regardless of the type of program or the client group(s) it serves, certain concerns probably come to mind. As there was no mention of increased funding in the plans to expand your program, this may be your first concern. Even if a substantial amount is allocated, it is likely that you will not have the same dollars per client to use as you did in your model program. Before attempting to replicate your program, you would be well advised to look at ways in which activities can be accomplished at a lower cost.
No mention was made of the objectives in your program that were not met--and there undoubtedly were some. Analyses of these objectives and their related activities should be undertaken at once, before expansion begins.
You also may have concerns about implications for increased staffing needs, additional materials to reach a larger range of clients, and other problems that arise when providing a program that serves a wide range of audiences. Advise others of these implications, too, so that a later evaluation can reflect the major changes that have been undertaken.
Note: The following outline is to be used by the workshop facilitator.
Facilitator's Outline Notes
A. Introduction
1. Indicate that the purpose of this activity is to apply participants' learnings to "real-life" situations.
2. Explain that the success of this activity depends upon the involvement and sharing of group members.
Facilitator's Outline Notes
3. Have participants review the read-ing for Competency 4 on page 10.
B. Process
1. Ask for volunteers to share experiences of program evaluations and the decision alternatives that were available as a result.

2. Invite participants to ask questions of the "presenter" to gain additional information as to the details of the program and the evaluation process.

3. Have participants vote as to whether or not they would continue the program.

4. Ask participants to justify their decisions, and invite the "presenter" to react.
5. Repeat the process with two to threeadditional presentations.
You may wish to provide the first example as amodel.
EVALUATION
PARTICIPANT SELF-ASSESSMENT QUESTIONNAIRE
1. Name (Optional)
2. Position Title
3. Date
4. Module Number
5. Agency Setting (Circle the appropriate number):
6. Elementary School / 10. JTPA / 14. Youth Services / 18. Municipal Office
7. Secondary School / 11. Veterans / 15. Business/Industry Management / 19. Service Organization
8. Postsecondary School / 12. Church / 16. Business/Industry Labor / 20. State Government
9. College/University / 13. Corrections / 17. Parent Group / 21. Other
Workshop Topics

PREWORKSHOP NEED FOR TRAINING: Degree of Need (circle one for each workshop topic)
POSTWORKSHOP MASTERY OF TOPICS: Degree of Mastery (circle one for each workshop topic)

1. Listing the audiences to be served through evaluation efforts and disseminating evaluation reports to a variety of audiences.

2. Verbalizing the potential impact that evaluation results have on various decision-making audiences.

3. Designing a rating form for evaluating the evaluation report.

4. Designing a plan for communicating evaluation results.

5. Determining an appropriate process to analyze program evaluation results.

6. Discussing the utility of formative and summative evaluation data in making program decisions.

7. Stating alternatives to be considered in making decisions about program modifications.

8. Providing a rationale supporting your recommendation to continue or discontinue a program.

Overall Assessment on Topic of Communicate and Use Evaluation-Based Decisions
(For each of the eight topics and the overall assessment, circle one number from 0 to 4 in the preworkshop column and one in the postworkshop column.)
Comments:
Trainer's Assessment Questionnaire
Trainer: ________ Date: ________ Module Number: ________
Title of Module:
Training Time to Complete Workshop: ____ hrs. ____ min.
Participant Characteristics
Number in Group: ____ Number of Males: ____ Number of Females: ____
Distribution by Position
Elementary School / Youth Services
Secondary School / Business/Industry Management
Postsecondary School / Business/Industry Labor
College/University / Parent Group
JTPA / Municipal Office
Veterans / Service Organization
Church / State Government
Corrections / Other
PART I

WORKSHOP CHARACTERISTICS. Instructions: Please provide any comments on the methods and materials used, both those contained in the module and others that are not listed. Also provide any comments concerning your overall reaction to the materials, learners' participation, or any other positive or negative factors that could have affected the achievement of the module's purpose.
1. Methods: (Compare to those suggested in the Facilitator's Outline)

2. Materials: (Compare to those suggested in the Facilitator's Outline)

3. Reaction: (Participant reaction to content and activities)
PART II

WORKSHOP IMPACT. Instructions: Use the performance indicators to judge the degree of mastery. (Complete responses for all activities. Those that you did not teach would receive 0.)

Group's Degree of Mastery: Not Taught / Little (25% or less) / Some (26%-50%) / Good (51%-75%) / Outstanding (over 75%)

Note: Circle the number that best reflects your opinion of group mastery.
Learning Experience 1: Group 0 1 2 3 4 / Individual 0 1 2 3 4
Learning Experience 2: Group 0 1 2 3 4 / Individual 0 1 2 3 4
Learning Experience 3: Group 0 1 2 3 4 / Individual 0 1 2 3 4
Learning Experience 4: Group 0 1 2 3 4 / Individual 0 1 2 3 4
Code:

Little: With no concern for time or circumstances within the training setting, if it appears that less than 25% of the learners achieved what was intended to be achieved.

Some: With no concern for time or circumstances within the training setting, if it appears that less than close to half of the learners achieved the learning experience.

Good: With no concern for time or circumstances within the training setting, if it appears that 50%-75% have achieved as expected.

Outstanding: If more than 75% of learners mastered the content as expected.
PART III

SUMMARY DATA SHEET. Instructions: In order to gain an overall idea as to the mastery impact achieved across the Learning Experiences taught, complete the following tabulation. Transfer the number for the degree of mastery on each Learning Experience (i.e., group and individual) from the Workshop Impact form to the columns below. Add the subtotals to obtain your total module score.
GROUP
Learning Experience
1 = score (1-4)
2 = score (1-4)
3 = score (1-4)
4 = score (1-4)
Total (add up)

INDIVIDUAL
Learning Experience
1 = score (1-4)
2 = score (1-4)
3 = score (1-4)
4 = score (1-4)
Total (add up)
Total of the GROUP learning experience scores and INDIVIDUAL learning experience scores: ____. Actual Total Score Compared to Maximum Total*
*Maximum total is the number of learning experiences taught times four (4).
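As a quick arithmetic check, the tabulation above can be sketched in a few lines of Python. This is only an illustrative sketch: the `module_score` helper and its sample scores are invented for the example, and it assumes a nonzero score means the activity was taught.

```python
def module_score(group_scores, individual_scores):
    """Tabulate the summary data sheet: sum the group and individual
    mastery scores (0 = not taught, 1-4 = degree of mastery) and
    compare the actual total to the maximum possible total."""
    actual_total = sum(group_scores) + sum(individual_scores)
    # An activity counts as "taught" if it received a nonzero score;
    # maximum total = number of scored activities times four.
    taught = sum(1 for s in group_scores + individual_scores if s > 0)
    maximum_total = taught * 4
    return actual_total, maximum_total

# Example: four learning experiences, with one individual activity not taught.
actual, maximum = module_score([3, 4, 2, 3], [4, 3, 0, 2])
print(f"{actual} out of a possible {maximum}")  # prints: 21 out of a possible 28
```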
Performance Indicators
As you conduct the workshop component of this training module, the facilitator's outline will suggest individual or group activities that require written or oral responses. The following list of performance indicators will assist you in assessing the quality of the participants' work.
Module Title: Communicate and Use Evaluation-Based Decisions
Module Number: CG E-2
Group Learning Activity Performance Indicators to Be Used for Learner Assessment
Group Activity Number 1:

Identify decisions that will be made as a result of a program evaluation.

1. Can participants list three audiences for evaluation reports?
   Funding agency
   Staff
   Community

2. List the kinds of things each group would like to know as a result of an evaluation.

3. List two decisions which each group might make as a result of an evaluation report.
Group Activity Number 2:
Discuss various approaches forcommunicating evaluation results.
1. List items to be considered prior to disseminating evaluation findings.
   Number of audiences
   Time and funds available

2. List dissemination techniques.
   Formal report
   Executive summary
   News releases
   Audiovisual presentations
Group Activity Number 3:
Brainstorm questions that can andcannot be answered as a result offormative and summativeevaluations.
1. Can participants articulate the difference between formative and summative evaluation?
2. Examine list of questions which formative evaluation answers.
3. Examine list of questions which summative evaluationanswers.
4. Can the participants identify the data sources needed toanswer the above questions?
Group Learning Activity Performance Indicators to Be Used for Learner Assessment
Group Activity Number 4:

Analyze the results of program evaluations and make decisions as to whether or not the programs should be continued.
1. Given an evaluation report, the group should be able toidentify program strengths and weaknesses.
2. Follow the logic to see if solid program modifications can be suggested as a result of the evaluation report.
3. Should certain components be discontinued? Which ones?On what basis?
REFERENCES
California State Department of Education. "Conducting Evaluation." A Planning Model for Developing a Career Guidance Curriculum. Sacramento: California State Department of Education, 1978.

Roberts, Sarah. "Communicating Evaluation Results." Module 12 of Developing Career Guidance Programs. Palo Alto, CA: American Institutes for Research, 1975.

Davis, Helen M., and Drier, Harry N. Deciding Via Evaluation. Columbus, OH: The Center for Vocational Education, The Ohio State University, 1977.

Young, Malcolm B., and Schuh, Russell G. Evaluation and Education Decision-Making: A Functional Guide to Evaluating Career Education. Development Associates, Inc., 1975.

Wolman, Jean. "Conducting Product Evaluations." Module 11 of Developing Comprehensive Career Guidance Programs. Palo Alto, CA: American Institutes for Research, 1975.
ADDITIONAL RESOURCES
"Communicating Evaluation Results." Module 12of Developing Career Guidance Programs. SarahRoberts. American Institutes for Research. 1975.70 pages.
This competency-based module is the final component of an extensive series focusing on the development of sound career guidance programs. It contains learning experiences that allow it to be used as a self-instructional tool or within a group setting.
Deciding Via Evaluation. Helen M. Davis and Harry N. Drier. The Center for Research in Vocational Education, The Ohio State University, 1960 Kenny Road, Columbus, Ohio 43210-1090. 1977. 100 pages.
This document is described as a decision-making guide for improving career guidance through evaluating program processes and outcomes. Although it is part of the Rural America Series, its value is not limited to rural settings.
Evaluation and Education Decision-Making: A Functional Guide to Evaluating Career Education. Malcolm B. Young and Russell G. Schuh. Development Associates, Inc., 1975. 88 pages.
This guide, which was developed under funds provided by U.S.O.E.'s Office of Career Education, is intended to aid specifically in the evaluation of career education programs. In addition to providing a step-by-step model for evaluation, it lists the Summary of Results of a Career Education Instrument Review conducted in 1974-75.
KEY PROJECT STAFF

The Competency-Based Career Guidance Module Series was developed by a consortium of agencies. The following list represents key staff in each agency that worked on the project over a five-year period.
The National Center for Research in Vocational Education
Harry N. Drier, Consortium Director
Robert E. Campbell, Project Director
Linda A. Pfister, Former Project Director
Robert Showmen, Research Specialist
Karen Kimmel Boyle, Program Associate
Fred Williams, Program Associate
American Institutes for Research
G. Brian Jones, Project Director
Linda Phillips-Jones, Associate Project Director
Jack Hamilton, Associate Project Director
University of Missouri-Columbia
Norman C. Gysbers, Project Director
American Association for Counseling and Development
Jane Howard Jasper, Former Project Director
American Vocational Association
Wayne LeRoy, Former Project Director
Roni Posner, Former Project Director
U.S. Department of Education, Office of Adult and Vocational Education
David Pritchard, Project Officer
Holli Condon, Project Officer
A number of national leaders representing a variety of agencies and organizations added their expertise to theproject as members of national panels of experts. These leaders were--
Ms. Grace Basinger, Past President, National Parent-Teacher Association

Dr. Frank Bowe, Former Executive Director, American Coalition of Citizens with Disabilities

Ms. Jane Razeghi, Education Coordinator, American Coalition of Citizens with Disabilities

Mr. Robert L. Craig, Vice President, Government and Public Affairs, American Society for Training and Development

Dr. Walter Davis, Director of Education, AFL-CIO

Dr. Richard DiEugenio, Senior Legislative Associate (representing Congressman Bill Goodling), House Education and Labor Committee

Mr. Oscar Optnes, Administrator (Retired), U.S. Department of Labor, Division of Employment and Training

Dr. Robert W. Glover, Director and Chairperson, Federal Committee on Apprenticeship, The University of Texas at Austin

Dr. Jo Herdic, Director of Planning and Development in Vocational Rehabilitation, New Hampshire State Department of Education

Mrs. Madeleine Hemmings, National Alliance for Business

Dr. Edwin Herr, Counselor Educator, Pennsylvania State University

Dr. Elaine House, Professor Emeritus, Rutgers University

Dr. David Lacey, Vice President, Personnel Planning and Business Integration, CIGNA Corporation

Dr. Howard A. Matthews, Assistant Staff Director, Education (representing Senator Orrin G. Hatch), Committee on Labor and Human Resources

Dr. Lee McMurrin, Superintendent, Milwaukee Public Schools

Mr. Meiklejohn, Assistant Director of Legislation, American Federation of State, County, and Municipal Employees

Dr. Joseph D. Mills, State Director of Vocational Education, Florida Department of Education

Dr. Jack Myers, Director of Health Policy Study and Private Sector Initiative Study, American Enterprise Institute

Mr. Reid Rundell, Director of Personnel Development, General Motors Corporation

Mrs. Dorothy Shields, Education, American Federation of Labor and Congress of Industrial Organizations

Dr. Barbara Thompson, Former State Superintendent, Wisconsin Department of Public Instruction

Ms. Joan Wills, Director, Employment and Training Division, National Governors' Association

Honorable Chalmers P. Wylie, Congressman/Ohio, U.S. Congress
Developed by
The National Center for Research in Vocational Education
The Ohio State University
1960 Kenny Road, Columbus, Ohio 43210
CATEGORY C: IMPLEMENTING
C-1 Counsel Individuals and Groups
C-2 Tutor Clients
C-3 Conduct Computerized Guidance
C-4 Infuse Curriculum-Based Guidance

CATEGORY D: OPERATING
D-1 Ensure Program Operation
D-2 Aid Professional Growth

CATEGORY E: EVALUATING
E-1 Evaluate Guidance Activities
E-2 Communicate and Use Evaluation-Based Decisions
Published and distributed by Bell & Howell Publication Systems Division, Old Mansfield Road, Wooster, Ohio 44691-9050. 1-800-321-9811, or in Ohio call (216) 264-8600.

BELL & HOWELL
Publication Systems Division
Publication Products