
Impact of teaching development programmes in higher education

Professor David Parsons, Dr Inge Hill, Dr Jane Holland, Dick Willis

HEA research series


The Higher Education Academy (HEA) provides a high quality evidence base for learning, teaching and assessments by commissioning and disseminating cutting-edge research, and by initiating timely and topical policy debates and responses.


Contents

Foreword
Acknowledgements

Section 1: Introduction
1.1 The study
1.2 Background and scope
1.3 Approach to the review
1.4 The report

Section 2: Understanding programme impact
2.1 Introduction
2.2 Setting the policy context
2.3 Teaching development in context
2.4 HE teaching development and its effectiveness
2.5 The scope of the evidence base

Section 3: Issues emerging
3.1 Introduction
3.2 Evidence of impacts on teachers' attitudes, knowledge and skills
3.3 Evidence of impact on teachers' behaviour and practice
3.4 Effects of disciplinary or generic programme focus
3.5 Compulsory versus voluntary participation in programmes
3.6 Student impact of teaching development programmes
3.7 Other programme impacts

Section 4: Impact research methods and models
4.1 Introduction
4.2 Impact assessment models and frameworks
4.3 Evidence strengths and merits
4.4 Specific evidence improvement opportunities
4.5 Practical challenges in impact evaluation

Section 5: Next steps
5.1 Introduction
5.2 Evidence of impacts
5.3 Evidence needs and gaps
5.4 Tackling the challenges

Annexes
Annex A: Research issues and questions
Annex B: Bibliography



Foreword

'We expect our reforms to restore teaching to its proper position, at the centre of every higher education institution's mission.' (Department for Business, Innovation and Skills, Higher Education: Students at the Heart of the System, June 2011)

The last 18 months have seen significant reforms in higher education across the four nations of the UK, unprecedented for a generation. Though taking very different approaches, the four administrations in England, Northern Ireland, Scotland, and Wales have all introduced far-reaching reform to improve higher education provision by increasing the emphasis on learners. Critical to achieving this are highly trained and innovative teaching and learning support staff.

The Higher Education Academy (HEA) plays a key role in this. As the national body for enhancing learning and teaching in higher education, a central part of our focus is on the accreditation of initial and continuing professional development (CPD) programmes delivered by higher education institutions. Accreditation provides external confirmation that this institutional provision is aligned with the UK Professional Standards Framework for Teaching and Supporting Learning in Higher Education (UKPSF). It is a widely recognised – and verifiable – indication that the provision has met the required standard.

Perhaps surprisingly, given the commitment to, and investment in, the enhancement of higher education provision by successive recent governments, there has been no comprehensive survey of the impact of Teaching Development Programmes (TDPs). This report by HOST Policy Research provides an up-to-date overview of the research undertaken in Europe and the United States, in particular, and highlights its strengths and limitations. It also presents a challenging set of recommendations for the HEA and the higher education sector to consider in order better to inform future decision-making.

Of particular importance are the calls for an agreed methodology for the impact assessment of TDPs and for the establishment of cross-institutional, longitudinal studies of their impact, which will overcome the limitations of recent and recurrent small-scale and short-term evaluations. We owe it to students to provide them with the best possible learning experience during their time in higher education, and excellence in teaching is one way of achieving this. We look forward to working with the higher education community to translate the recommendations into action.

The research is a timely call for concerted effort to address a key issue.

Professor Craig Mahoney
Chief Executive
Higher Education Academy


Acknowledgements

A review of this nature draws on very wide experience and not only from the over 130 authors and co-authors whose published research has been the focus of this research. On behalf of the research team at HOST, I would like to thank those authors for their contributions to this growing field of inquiry, and also the individuals and agencies – in the UK and elsewhere – who have contributed their experiences and suggestions for source material.

A number of HEA staff have also made direct contributions based on past research and evaluation in each of the home countries of the UK, and others in the HEA have contributed to marshalling some of the discipline-specific available evidence. Special thanks are due to those in the HEA, and outside, who have drawn this review to the attention of colleagues in some of the UK professional networks, and to the always helpful and encouraging contributions of the Project Steering Group.

Finally, we would like to thank both Professor Keith Trigwell and Trevor Habeshaw for their valuable insights and helpful commentary on our analysis. Our conclusions, however, together with any errors or omissions, are solely the responsibility of the authors.

Professor David Parsons
HOST Policy Research

7 September 2012


Section 1: Introduction

1.1 The study

In May 2012, the Higher Education Academy (HEA) asked HOST Policy Research (HOST) to conduct an intensive review of the impact of teaching development programmes in higher education (HE). This is a timely assessment, which coincides with the adoption of a revised framework for professional standards by the HEA and growing policy interest in teaching development of academic staff. It also comes at a time of growing scholarly interest in improving teaching quality, with a widening research base providing the focus of this review.

This report follows an intensive review period. It is presented as a state of the art, evidence-based assessment of the impact of HE-based teaching development programmes and initiatives. It also looks at the strengths (and weaknesses) of the research base from which the evidence is drawn, and reflects on the lessons for future evidence-based policy development.

1.2 Background and scope

In most developed economies, and widely in Europe (Parsons et al., 2010), teachers in HE are not required to hold accredited teaching qualifications either by statute, standard or convention1. Some commentators have characterised university teachers as the last of the 'non-professions' (Baume, 2006), and elsewhere in education, including non-HE areas of post-compulsory education, all teaching staff are required to be qualified (or qualifying)2. The UK is not unusual in reflecting this picture, but the situation is changing and has seen rising activity in promoting and delivering teaching development strategies, especially since the 2003 English Higher Education White Paper (DfES, 2003).

The approaches that have been put in place across the UK have been very diverse. These reflect the different policy contexts across the four nations, origins of different parts of the 'HE sector' and especially the independent pedagogic traditions of different institutions (Gibbs et al., 2000) and disciplines. The variety that has emerged has encompassed predominantly institutional programmes combined with some nationally supported programmes, as well as subject-focused initiatives including some emerging areas for public policy, such as entrepreneurship education.

Public policy in the UK has played an important role in stimulating these developments, especially in recent years, including with the establishment of the HEA in 2004, which is devoted to the enhancement of the quality and impact of learning and teaching in HE. This has included a series of cross-institutional and partnership arrangements including the HEFCE-funded Centres for Excellence in Teaching and Learning (CETL) programme (England and Northern Ireland) 2005-2010, the Scottish Quality Enhancement Themes (2003-present), Wales' Future Directions initiative (2009-present) and the HEA's continuing UK-wide discipline-specific support.

1 For example, by commonly applied HE recruitment or selection practices.
2 The number of pre-qualification teaching staff on school-based training programmes has been expanding in the UK, and since 2002 all teachers, trainers and tutors in publicly funded further education and skills programmes are required either to hold an approved teaching qualification or be undertaking an accredited programme of pedagogic training leading to an approved qualification.


Nonetheless, in the UK, delivery has emphasised institutionally led strategies and provision. Here, institutional approaches have evolved mostly independently but under some common stimuli. In recent years, this has included the (now revised) UK Professional Standards Framework (UKPSF) for teaching and supporting learning in HE, which provided a focus through a framework of common standards3, and encouraged institutions to develop and apply teaching development programmes fitted to the specific needs of different academic and other staff.

HE teaching qualifications have been a part of this 'framework' approach (e.g. the Postgraduate Certificate in Academic Practice (PGCAP)) but, although qualifications have been gaining currency, the direct relationship of institutional teaching development initiatives to qualifications remains variable. Formal, sometimes mandatory, but non-qualification approaches have dominated most institutional and other arrangements, and developed under different strategic and funding stimuli. However, for new and aspiring academic staff4 the qualification pathway is becoming a more established feature of institutional strategies.

Approaches to teaching development in UK universities – as outside – do not stand still. These continue to evolve, and teaching development in HE has correspondingly attracted growing research interest. This in turn has resulted in a widening literature aimed at scholarly inquiry and knowledge exchange. Much of this has traditionally focused on processes – the ways in which teaching development programmes are developed and delivered. There has been, until recently, much less systematic research interest in programme outcomes, and the difference they make to participants and practice (Hanbury et al., 2008).

In the UK, with government and executive administrations in the four home countries looking to teacher development programmes to boost teaching quality and institutional responsiveness, impact evidence is becoming more and more important. This study aimed to explore the available evidence and in particular to:

a review the literature that exists on the impact and efficacy of teaching development programmes, in order to draw conclusions about the elements of programme design and delivery that appear to have the greatest impact on the improvement of student learning;

b review the literature in order to make specific recommendations about the method and nature of future research into the efficacy and impact of teaching development programmes;

c assess the gaps and weaknesses in the available research, with a view to identifying themes, priorities and methodology for any future research in this area.

A number of subsidiary research questions were set by the HEA and these have been amplified into a series of specific evidence collection strands (see Annex A). The scope of the study reflects the HEA's UK-wide remit, but account has also been taken of international evidence to provide a comparative context for the research and to draw on evidence and lessons emerging from teaching development programmes outside the UK.

1.3 Approach to the review

Our approach to addressing these information needs has centred on an essentially 'secondary' research methodology. This has aimed to identify and make best use of what had been expected to be widely dispersed published sources including key agency evidence, a range of literature including scholarly publications, as well as other research and evaluation evidence. The review has involved a four-stage methodology:

3 Specifically the UK Professional Standards Framework, which is sector-owned and provides support for the design and structure of institutional teaching development programmes.

4 For example, through graduate teaching support or assistant development programmes.


• Stage 1: Project inception and planning, including HEA progress reporting and liaison.

• Stage 2: Mapping wider international experience of HE-centred teacher development programmes, and evidence sources from cross-national agencies.

• Stage 3: Systematic review of literature and documentation against the HEA's research questions, together with a gap analysis of coverage and validity.

• Stage 4: Collation and reporting, including an interim and draft final report(s), and taking into account HEA comments in this final report.

This approach has been put together with the necessary intensity of the study in mind and to ensure timely delivery of findings and recommendations on future priorities for the HEA. Beyond a systematic literature review, we have also sought wider evidence through a call for ongoing or (as yet) unpublished research-based evidence and through selective social media5.

A wider assessment has also been conducted of cross-national evidence available from selected European and other international agencies. This has provided little evidence, establishing that while such agencies have a policy review interest in the outcomes of national efforts to raise instructional professionalism in HE teaching staff, they have not conducted any systematic research themselves to understand it.

The literature review identified 312 published sources of potential relevance from our search criteria, with just over a third proving to have some specific relevance. Many of the other sources we have identified either were mis-tagged by journal or database reference systems (i.e. their content was not appropriate), or were centred on the impact of teacher development outside of HE. Some of those that were HE-centred were essentially descriptive reviews of teacher development initiatives and had no content relevant to the impact focus of this review. A small proportion of those selected for deeper review focused on the impact of staff development on different educational levels including but not specific to HE.

1.4 The report

The report is presented in five sections, which, following this introduction to the review, comprise:

• a review of the context to understanding programme impact, including programme evaluation and impact assessment in the UK and more widely (Section 2);

• a synthesis of the available evidence on achieved impacts – for teachers and students, and on other programme impacts (Section 3);

• an assessment of impact research methods and models, and the strengths and merits of the available evidence including improvement opportunities (Section 4);

• a concluding assessment looking across the review and also setting out emerging evidence needs and gaps, possible priorities and next steps (Section 5).

5 In an attempt to ensure that we had captured work in hand or unpublished sources, a request for information was posted with Network and Learn and within two LinkedIn groups – Higher Education Management (36,378 members) and Higher Education Teaching and Learning (18,668 members). Although both sources elicited replies of interest and support, neither produced any empirical studies. The British Council were also contacted to see if any of their programmes, supporting higher education development, were relevant and, if so, had been evaluated – the response was negative.


In addition, the report provides two supporting annexes. The first sets out the research issues and questions set by the HEA (Annex A). The review draws extensively on published research by others, with citations drawn together in the second annex – the supporting bibliography (Annex B). We caution that multiple references are made to some sources across the sections of the report. This reflects the particular significance of some of these sources, and their wider relevance for different aspects of the impact evidence covered in this review.


Section 2: Understanding programme impact

2.1 Introduction

The various teacher development initiatives that have taken place in the UK need to be set against a wider context for HE. This section looks at some of this backcloth and in particular at:

• teaching development activities in context;
• HE teaching development programmes and their effectiveness;
• understanding the evidence base.

This wider context continues to evolve, but in the UK is changing particularly rapidly. Our starting point is consequently to look at some of the factors affecting institutional and other responses to teaching staff development, and teaching improvement as a quality instrument, in the policy context for HE.

2.2 Setting the policy context

The last 25 years have seen HE in the UK expand greatly. This has been accompanied by important structural and funding changes led by a succession of public policy reviews. A part of this has been a substantial increase in the focus on educational development for academic staff.

Specific policy drivers for this change have included the English White Paper, The Future of Higher Education (DfES, 2003), which provided for a substantial public investment to encourage good teaching practice and to reward those who are excellent in teaching. This was reflected in the Higher Education Funding Council for England's (HEFCE) Strategic Plan (2003), which made corresponding commitments, endorsing the aim to improve the status and recognition of excellent teaching and learning as a key element in the mission of HE, alongside research. Public policy across the four home countries has chosen to promote institution-led approaches and strategies, with encouragement also for partnership activity including the 2005-2010 CETL programme (England and Northern Ireland).

The most recent policy review for HE in England, the Coalition Government's Higher Education White Paper Students at the Heart of the System (BIS, 2011), set out a programme of further reforms and a vision for building a world-class HE sector. Central to the current reforms is the principle that when entrants face much higher direct costs, albeit largely deferred, in order to participate in HE, providers have the obligation to be more responsive to student choice and demand, and should be free to respond to those needs. Informed choice by prospective students is at the heart of these developments, with measures of teaching quality one of the key features to inform student decisions. The Scottish Government (2012) has recently published a pre-legislative paper setting out the next steps they will take to develop their proposals for the delivery of their manifesto commitments for post-16 education. Their proposals do not include the wholesale transfer of the financial burden to the student, as they do in England. The devolved administrations of Northern Ireland and Wales are also reviewing policy in this area. However, in all three cases, regardless of the funding regime, it is likely that many of the pressures regarding perceptions of quality and value for money will be shared.

As an important early step, the Government has subsequently developed the Key Information Set (KIS) to give prospective students access to high quality information about different courses and institutions, enabling more informed choices. Inclusion of these data reflects an acknowledgement that in an environment of rising fees and deferred costs for individuals (i.e. through student loans), students are likely to consider the quality of teaching before making their selection of HEI at which to study:

... the increase in tuition fees for English students will mean that the sector will need to focus more than ever on ensuring educational quality. Students, quite rightly, demand value for money, and institutions will have to concentrate on further establishing their effectiveness in order to justify higher fees - the quality of learning and teaching will be key. (Mahoney, 2012)

The focus on demonstrably enhancing (and measuring) institutional teaching quality is, of course, hardly a novel development, and is better regarded as a continuation of a long-standing trend to increase the focus on quality of teaching as an issue in institutional competitiveness. However, this is given added impetus as one of a number of considerations likely to become increasingly prominent to inform and aid institutional and subject choices by prospective students.

The interest in teaching quality, its improvement through staff development and the understanding of its effectiveness, is consequently not new and is not fuelled wholly by the accelerating demands of student choice. Nonetheless, it could be reasonably argued that undergraduate education in the UK in particular (in common with the United States) is now more than ever regarded as a commodity within a competitive (mainly) domestic market. Participants' choices are seen not primarily as aspirational or lifestyle choices, but increasingly as life investment decisions related to employment prospects (Chalmers et al., 2008).

With institutions likely to be facing more informed choices by applicants, finite or shrinking demand and potentially falling student learning revenues, teaching quality emerges as a discriminator for many institutions. The quality and effectiveness of their actions regarding teacher development are set to move centre stage in institutions' strategic responses to managing these and other challenges.

2.3 Teaching development in context

Against this background, public policy (and funding) has encouraged a higher profile and strategic approaches to the improvement of teaching quality by institutions (Gibbs et al., 2000). An added impetus in the UK was the adoption of 'learning outcomes' across Europe as part of the Bologna Process and its associated Dublin descriptors, which impacted on the teaching and learning methods needed to achieve those outcomes (Lindblom-Ylänne and Hämäläinen, 2004). Similar changes occurred much earlier in some other developed economies, most notably the US, in part driven by student protests about "irrelevant courses and uninspired teaching" (Gaff and Simpson, 1994, p. 168).

Within the UK there has been rising activity in promoting and delivering teaching development strategies, especially since 2003. Gibbs estimates that this area of work, which involved only around 30 active academics, mostly part-time, in the UK in the 1970s, now involves thousands of academic development personnel and substantial institutional investments (Gibbs, 2012). Others (D'Andrea and Gosling, 2005; Stes et al., 2010) have also noted that the emergence of discourse around learning and teaching is one of the more remarkable HE phenomena of the last decade.

What has emerged in the UK from this sharply expanded activity has been characterised as very diverse. Different approaches have been taken to describing this diversity in the UK but, while avoiding classification, it is clear that 'programmes' vary from an expanding range of usually part-time and certificated interventions (McArthur et al., 2004), typically of one year's duration, to continuing education through (usually) short-term, block or intensive workshop programmes (Rust, 1998). These post-experience programmes may also be complemented by more dynamic measures such as specific teaching staff development activities including systematic mentoring and observation-based video feedback (Schreurs and Van Vilet, 1998) or what has been collectively called "micro-teaching" (Trigwell, 2012) and also "portfolio work" (Jarvinen and Kohonen, 1995). Diversity, of course, goes beyond mode of delivery and includes different programme focuses and purposes, ranging from changing practice to changing perceptions.

Researchers have suggested this UK diversity stems in particular from the fragmented development (and origins) of the 'HE sector'; a particular feature of this seems to be the independent pedagogic traditions of different institutions (Gibbs et al., 2000). Early analysts of these developments also noted the importance of different institutional emphases on a broader spectrum of knowledge and skills (Nasr et al., 1996). In some situations cohesion has also been positively influenced by wider quality assurance initiatives such as that led by the HEA and QAA for the Welsh Assembly Government (HEA/QAA, 2009).

In recent years, UK developments have seen an embryonic trend towards greater cohesion, in particular through the (now revised) UKPSF, providing a framework of common standards and a basis for systematic accreditation of different institutional approaches. This has encouraged institutions to develop and apply teaching development programmes fitted to the specific needs of different academic and other staff.

While common qualifications have been a non-mandatory part of this 'framework' approach (e.g. the PGCAP), the relationship of qualifications to formal institutional programmes for teaching development has been variable. Formal, sometimes mandatory, but non-qualification approaches have dominated most institutional and other arrangements and developed under different strategic and funding stimuli. However, the qualification pathway for new academic staff is becoming more of a feature of institutional strategies.

Recent analysis has suggested that in the UK there have been contrasts between generic development approaches, usually located centrally within HEIs and on occasions HEI partnerships, and specific (usually) disciplinary-based approaches (Gibbs, 2012). The latter are often seen as collaborative and sometimes led by nationally based disciplinary associations or professional groupings. While generic teaching development programmes have been said to be becoming more sensitive to disciplinary differences, the generic and specific approaches often have little interaction (Gibbs, 2012). Here, some comparative evidence from Australia, New Zealand and the United States suggests that differences in disciplinary contexts are reflected not just in disciplinary pedagogies, but in disciplinary cultures underpinning teaching cognition (Kane et al., 2002).

In some institutions teaching development approaches have been mandatory for newly appointed academics, or during their early career, but compulsion has not been a feature of the development of established staff. To this can be added voluntary provision for aspiring academics including graduate teaching support or assistant development programmes. Looking across this legacy and continuing developments, Gibbs (2012) has described this area of activity as having a number of components (not listed here) including the development of teachers as individuals and groups. Gibbs' recent assessment also lists a series of discernible trends in educational development over recent decades, and specifically:


• emergence from a focus on the classroom to a focus on the learning environment;

• changing emphases from individual teachers to a focus on course teams and departments, and also leadership of teaching;

• a parallel change from a focus on teaching to a focus on learning;

• a developing emphasis from change tactics to change strategies;

• a changing focus from quality assurance to quality enhancement;

• a changing focus also from 'fine tuning' of current practice to transforming practice in new directions.

At the same time, Gibbs suggests that these trends have been accompanied by a number of conceptual changes: psychological to sociological; atheoretical to theoretical; experiential and reflective to conceptual and empirical; unscholarly to scholarly; amateur to professional; and context neutral or context blind to context and discipline sensitive (Gibbs, 2012). Others have seen these trends and conceptual shifts as reflecting the complex, multi-factorial nature of the professional development of HE teachers.

2.4 HE teaching development and its effectiveness

If researchers are starting to show common ground on what teaching development in HE constitutes, there are various measures of how institutions have been increasing their focus on teaching quality. One ready measure is the extent to which teaching quality has become a component of their human resource and performance management processes. In 1994, institutions reported making, on average, only just under one in eight (12%) of promotion decisions primarily on the grounds of 'teaching excellence'. At the same time, just over a third of all institutions (38%) reported not making any such promotions (Gibbs, 1995). This situation seems to have changed rapidly. Although it is not a direct comparison, it nonetheless seems that just six years after Gibbs' analysis suggesting teaching performance was a minor variable in staff progression decisions, the proportion of institutions including recognition and reward mechanisms in their learning and teaching strategy had increased to 65% (HEFCE, 2001).

The provision and use of data on teaching quality to support student choice (as well as institutional performance assessments) is controversial and its utility is not straightforward. Researchers have suggested that there is no generic definition of good teaching that suits all contexts and student cohorts (Donnelly, 2007). Much will depend on context – a recurrent feature emerging from the research – and also what teaching frameworks or models are being applied, to whom and where.

As a result, trying to determine whether or not good teaching – of any kind – supports or encourages good learning is seen as extremely difficult. Gibbs (2010), one of the few researchers with cross-national evidence, notes that (at least until very recently) comparative indicators of quality currently available in the UK are unlikely to provide prospective students with a valid basis to distinguish between individual courses with regard to their educational quality.

Gibbs goes further to suggest that on these foundations, the collation of currently available data into institutional or sub-institutional league tables is likely to be at best misleading and at worst inaccurate. He suggests that the best predictor of educational gain is measures of educational processes – primarily those that concern a small range of fairly well-understood pedagogical practices that engender student engagement (e.g. high quality direct feedback on students' assignments).


The argument is challenging, not least because in the UK there are very limited data about the distribution and prevalence of these educational practices. This is because arrangements for quality assurance, and institutional review and comparison, do not systematically document such evidence. Nor are they (in the main) the focus of the comparative National Student Survey.

This situation in the UK is in contrast to some international experience, notably in the US where the National Survey of Student Engagement (NSSE) has been used successfully by many institutions to benchmark and identify weaknesses in current educational processes. The same source has also proven a baseline to demonstrate the positive impact of the introduction of certain educational practices. Here, it has been reported that pooling data across such innovations provides a valid basis to guide other institutions in the adoption of practices that are likely to be effective. However, the NSS cannot be used in the same way (Gibbs, 2010).

Against what might be seen as a confused situation for understanding the effectiveness of institutional activities, issues of measurement of teaching quality (and their use as tools in evaluation) are now emerging as a focus for effort in the UK. However, this issue is not confined to the UK, although the policy context, and levers of change, are different in other countries. By the late 1990s there were calls in Australia, the US, Canada and elsewhere for more systematic research-led approaches to effective measurement. By the end of the decade researchers were calling for an international collaborative research programme (Gilbert and Gibbs, 1999) to provide a focus for knowledge sharing. The need to ensure an appropriate evidence base continues to be highlighted (Chalmers, 2010), with recent commentators favouring quantitative input and output indicators in combination with qualitative process and outcome indicators (Chalmers, 2010; Stes et al., 2010a).

2.5 The scope of the evidence base

This study has focused on the available institutional and cross-institutional evidence from programme evaluation and related research. Its particular focus has been practical, evidence-based assessments of impact and impact determinants. Such evidence has long attracted scholarly inquiry, and by 1981 Levinson-Rose and Menges were able to identify some 71 studies from the mid 1960s of teaching development programmes and interventions (Levinson-Rose and Menges, 1981). However, the authors also provided an early diagnosis of one of the central challenges facing those seeking to understand programme impacts, and were highly critical of the quality of much of the early research.

Nearly two decades later little had apparently changed. In 1998, Weimer and Lenze sought to extend (by scope) and update the 1981 assessment, focusing on literature from the 1980s. Although adopting a more generous definition of the research in scope, and acknowledging the expansion of scholarly inquiry in this area, they felt their review provided inconclusive evidence of the positive effects of programmes. Like their predecessors 17 years earlier, they also called for systematic approaches to provide for more and higher quality research.

Evidence drawn from the 1970s and 1980s inevitably lacks currency, although it may still have lessons for issues such as viable methodologies. However, later reviews of the research and evaluative evidence base (Kreber and Brook, 2001; McAlpine, 2003; Prebble et al., 2004) have also tended to reinforce these conclusions about the fragmentation and often lack of coherence of much of the evidence base.

All of these studies, and many of the contributions they sought to review, are returned to in the next two sections of this study. This review aims to harness this growing evidence base to look very specifically at programme impact, to identify any common ground on their impact assessment, and to better understand the strengths of the evidence base and its limitations.


Section 3: Issues emerging

3.1 Introduction

The 'reviews' cited in the last section have been a starting point for this study. The review process has been described earlier and has focused on 108 identified sources meeting our criteria of being evidence-based and reporting on (at least some) impact findings. This evidence is drawn together here for six specific areas:

• impact on teachers' attitudes, knowledge and skills;
• impact on teachers' behaviour and practice;
• effects of disciplinary or generic programme focus;
• effects of compulsory and/or voluntary participation;
• the impact of teacher development programmes on the student learning experience;
• other impacts.

This classification of evidence, drawn largely from the most recent review conducted by Stes and colleagues (Stes et al., 2010b), is adapted to add further categories the HEA have set out as of particular interest to policy development. While all of the sources cited have relevance to one or more of these areas, the scale and breadth of the evidence drawn from individual studies has been highly variable – an issue returned to in the next section of the report.

3.2 Evidence of impacts on teachers’ attitudes, knowledge and skills

Stes and colleagues (2010a) conducted a review of 37 published sources of evidence on the impact of 'instructional development' in HE. In a watershed study mapping the breadth of available evidence against their typology of impacts, they concluded that effect on teachers' attitudes, knowledge and skills was the most common focus for analysis of impacts of programmes.

Looking more widely than the Stes review, it seems that relevant research has been drawing predominantly on self-assessed 'participant' data, although some studies have also included control group contrasts as a way of assessing additionality. However, these studies lack a common framework for what constitutes impact on teachers' attitudes, and variously look at constructs and concepts of teaching and learning, intentions, knowledge and skills, motivations and self-efficacy. The review by Guskey of the nature of faculty development impacts (2000) had collectively referred to these as "academics' conceptual change". Stes and colleagues (2010a) went further and attempted to classify identified attitudinal impacts according to:

• impacts on teacher attitudes (changes in attitudes towards teaching and learning);
• impacts on teaching conceptions (changes in ways of thinking about teaching and learning);
• impacts on teaching knowledge (acquisition of new or enhanced concepts, procedures and principles);
• impacts on teaching skills (acquisition of thinking/problem solving, psychomotor and social skills).


The authors distinguished between these changes and others involving transfer of these changes and acquisitions through changed behaviour (reviewed below). Their reviews showed some crossover between studies, with most (27 of the 36) providing evidence of one or more of these impacts, but none of all four.

Impact on teaching attitudes: Work by Stes and colleagues (2010a and 2010b) showed that, among the studies they reviewed, the most common impact focus was on teachers' attitudes, followed by teaching knowledge and skills, but with little emphasis on teaching conceptions. The research evidence for teaching attitudes is taken here together with teaching conceptions – the two being closely related and often not distinguished in some of the reporting by researchers. However, Stes and colleagues were critical of the evidence quality for many of the studies they reviewed in the area of teaching attitudes, emphasising that while:

… positive effects were reported … [in] none of these studies was a control/comparison group used. Only (one study) used a pre-test/post-test design. (2010a, p. 31)

The current review went further than the Stes et al. review of 37 published studies. This included a large-scale study in Europe by Hanbury and colleagues (2008), which looked at changes to teaching attitudes and conceptions in over 30 UK universities. This did use a pre-test/post-test design based on the Approaches to Teaching Inventory (ATI) tool (Pintrich et al., 1989; Trigwell and Prosser, 2004) to diagnose changes in participant samples across these universities. The study showed a large effect for programmes in achieving conceptual changes and in particular a shift in participating teachers towards student-led approaches.

The study by Hanbury and colleagues was important not only for its change measurement approach and scale, but in providing indicative evidence that, since changes towards student-centred teaching approaches have been associated elsewhere with desirable changes in student approaches to learning (Trigwell et al., 1999; Trigwell, 2012), HE-based teaching development programmes would seem to be having positive effects on teachers' underpinning conceptions.

Postareff et al. (2007) also undertook a more systematic review based on 200 HE teachers from different disciplines across two institutions. The study was an attempt to bring more information to this discussion by examining whether the length of training of university teachers has an effect on approaches to teaching measured by the ATI and, furthermore, on self-efficacy beliefs. The study included a control group and was mixed method, involving the use of the ATI and interviews. Their assessment showed that the training enhances a shift from the information transmission/teacher-focused (ITTF) approach to the conceptual change/student-focused (CCSF) approach, but cautioned that this is a slow process.

From this study, only after a year-long process of pedagogical training were teachers reported to be more student-centred than those who did not have training at all. In interviews, teachers mentioned only positive effects of pedagogical training on teaching, and that it made them more aware of their approach to teaching and their teaching methods. The authors concluded that awareness of one's own approach to teaching is essential in improving teaching practices.

The results of this study confirmed earlier work (Trigwell and Prosser, 1996; Akerlind, 2003; Gibbs and Coffey, 2004) that training increased the extent to which teachers adopted the CCSF approach to teaching, but also that changes in the ITTF approach were not as strong. Put together, the evidence implies that programme-related changes to approaches to teaching and self-efficacy beliefs typically develop only slowly. Postareff's evidence suggested it takes at least a one-year training process until positive effects emerge. In fact, shorter training seems to make teachers more uncertain about themselves as teachers:

… shorter courses might increase degrees of uncertainty, and lower self-efficacy, whilst a longer course increased the teachers' self-efficacy, and supported conceptual changes. (2007, p. 259)


In this analysis, the authors postulate that shorter courses result in a negative effect because they raise the teachers' awareness and make them less certain about their self-efficacy. Over a longer period they become more aware of ideal ways to teach. Teachers with no training lack this self-awareness and may feel that they are good, student-centred teachers – training initially causes this perception to collapse.

The period of gestation was even longer for changed beliefs to transfer to practice – an issue returned to below. This broad conclusion is supported by other evidence, with Gibbs and Coffey (2004) arguing that university teachers became less teacher-centred and more student-centred by the end of the four to 18 months' training. Put together, the available evidence suggests that after a long training process a shift from a teacher-centred to a student-centred approach is possible (Postareff, 2007), but it also shows that the effect of pedagogical training is not necessarily linear.

Unusually among such studies, Postareff and colleagues went further to provide some longitudinal evidence from a follow-up (Postareff et al., 2008). This looked again at the experiences of a smaller sample of the initial participants, contrasting those who had participated in further pedagogic training after the initial programme (2004) and those who had not participated in further training since 2004. This showed that there were more positive changes in the measured scales among teachers who had acquired more credits of pedagogical courses since the year 2004 than among teachers who had not acquired more credits.

Donnelly (2007) reports a study of a Postgraduate Certificate in Third Level Learning and Teaching, an accredited continuous professional development (CPD) programme for academic staff and faculty members, located in an HEI in the Republic of Ireland. This CPD programme creates a climate of trust and respect that is approving of dialogue, encouraging of open debate, and supportive of risk-taking in teaching, building on the work of Marshall (2004), who noted that: "the power of peer observation resides in its developmental and collegial orientation and its exposure of colleagues to affirmation, constructive criticism, and the experience of how others teach differently" (p. 187).

The programme was designed to provide a forum for debate and dialogue around what constitutes 'good learning' for students and 'good teaching' by academics. Critical insights on the scheme are offered through a synthesis of relevant theoretical literature, discussion of the mechanics and climate of the scheme, and evaluations by the academic staff and faculty members participating over the past five years. It was evaluated using a mix of data – evaluation forms, interviews and document collection. The author concluded that the programme aids the integration of theory and practice, and demonstrates the value of interdisciplinary learning and how it can benefit, in particular, the practice of teachers new to HE.

Looking outside the evidence of impacts from programmes in English-speaking countries, Ménard et al. (2011) in Quebec used the Teacher's Sense of Efficacy Scale questionnaire developed at Ohio State University (OSTES) to measure individual teachers' assessments of their efficacy in student engagement, instructional strategies and classroom management. This questionnaire had to be translated into French, and it is not clear whether the long or short form was used6.

The use of the OSTES scale here was seen to have worked better for assessing impacts on teachers' attitudes than a parallel exercise looking at impacts for the students (reviewed below). However, the OSTES-based 'alpha' coefficients as calculated did not correspond well to those in the paper from which the instrument was sourced. This effect was apparently due to changing the size of the scale used as well as working with a smaller sample of teachers. For the next iteration of the evaluation programme, Ménard et al. will return to the original nine-point evaluation scale recommended by Hoy (2008).

6 http://people.ehe.osu.edu/ahoy/research/instruments/#Sense.
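As a purely illustrative aside, the 'alpha' coefficients referred to above are internal-consistency statistics (Cronbach's alpha) for a multi-item scale such as the OSTES. The short Python sketch below shows one standard way such a coefficient can be computed; the function name and the example data are invented for illustration and are not drawn from the Ménard et al. study.

import numpy as np

def cronbach_alpha(item_scores):
    # item_scores: matrix of shape (respondents, items)
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]                               # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Invented example: six respondents rating four items on a nine-point scale
scores = [[7, 8, 6, 7],
          [5, 5, 6, 5],
          [9, 8, 9, 9],
          [4, 5, 4, 4],
          [6, 7, 6, 6],
          [8, 8, 7, 8]]
print(round(cronbach_alpha(scores), 2))

Because the coefficient depends on the number of items and on sample variability, shortening a scale or working with a small sample, as reported above, can by itself shift the value obtained.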


Butcher and Stoncel (2012) used an institutional case study approach to explore the nature and extent of the impact of the Postgraduate Certificate in Higher Education (PGCHE) on teachers appointed for their professional expertise. Data were collected in four iterative stages to investigate perceptions of graduates from the course (2006-2009), as well as current participants midway through their programme. The research involved mixed methods (document collection, survey, semi-structured interviews and a focus group). Analysis revealed that teachers were willing to adopt new approaches to teaching, planning and assessment. A shift from teacher-centred to learner-centred approaches (Prosser et al., 2006) was apparent, together with a shift in 'professional identity'. In particular, the study confirmed the benefits of interdisciplinary discourse. The same study also showed impacts beyond teachers' attitudes and conceptions, on the student experience and on participants' careers, and these were also discernible at departmental level.

Many of the North American, European and wider studies providing some impact evidence on teacher attitudes and concepts are based on specific programmes, often reviewed at a particular point in time and uncommonly followed up (at least in the published literature) after the initial review period. Others have been described as focusing on particular teacher development activities or interventions falling short of whole-programme evaluation (Kreber and Brook, 2001). While a narrow impact evaluation focus does not invalidate findings, it makes the delivery focus and context of the programmes important, and some researchers have commented that such contexts are often not fully described in the literature, limiting their utility (Stes et al., 2010a). It also means that many of the studies are relatively small scale, and in some cases based on experimental or quasi-experimental approaches, a feature reflected in other impact evidence sources.

Lueddeke (2003), for example, reports an exploratory study to inquire into the relationship between a number of factors that characterise academics working in HE and their approaches to the scholarship of teaching. The findings suggest that discipline and teaching conceptualisation have the strongest influence on teaching scholarship, while qualifications and years of teaching have a moderate impact; gender and position do not appear to play a significant part. The impact evidence base from small samples has been criticised by some as insubstantial, but the quasi-experimental approach has also been welcomed by others (Stes et al., 2010a) as making an important contribution in testing future evaluation frameworks for reliability.

Impacts on teachers' knowledge: Specific evidence of impacts on teachers' knowledge is fragmented. However, two studies did focus on the use of teaching development programmes for building teacher awareness and skills in working with students with disabilities. Getzel et al. (2003), for example, report the results of an online survey sent to 21 universities and colleges funded during 1999-2002 by the United States Department of Education's Office of Post-secondary Education (OPE) to develop and implement faculty and administrator professional development activities as part of an effort to ensure that students with disabilities receive a quality post-secondary education. Responses were received from 17 institutions and analysed qualitatively to identify recurring issues and themes. A key conclusion was that in order for CPD activities to thrive it is necessary to build partnerships with faculty, administrators, students with disabilities, and other departments. However, the authors noted that the survey framework and narrow range of respondents' roles (Project Directors or Co-ordinators only) may have limited the information and responses received. The authors also commented on the need to undertake studies to measure long-term individual and institutional changes.

A similar focus was taken by Sowers and Smith (2003), who undertook a cross-HEI study of a component of the Health Sciences Faculty Education Project (Oregon Health and Science University) involving 39 institutions. The programme, 'A Day in the Life of Health Science Students', was field-tested with 247 Nursing, Medicine, Dentistry and allied health faculty across these institutions. Participants were asked to complete a survey before and after the training, and these were analysed to reveal perceptions of the ability of students with disabilities to be successful in their programmes. The results demonstrated that participation in the training positively impacted the knowledge of the participants about students with disabilities. However, the survey lacked a control group, was entirely dependent on self-reported data and lacked any attempt to determine the long-term impact.

Impact on teaching skills: Evidence of impacts on teacher skills is also more limited and, where it is available, tends to focus on self-assessed reporting by programme participants. A study by Dixon and Scott (2003), for example, drew on self-assessed feedback that suggested teachers judged their participation in teacher development programmes as having increased their teaching and learning skills. This reported particular positive effects for creating "optimal and comfortable learning environments", time management and enhancing student motivation and interaction. The study by Postareff and colleagues (2007), using follow-up interviews with some survey participants, showed that teachers reported the development of reflective skills from participating in the programme.

Research evidence on programme impacts on teaching skills often lacks precision, with effects frequently reported at generalised levels for skills acquisition. A North American-based review, by Persellin and Goodrick (2010), focused on skills acquisition and surveyed past participants of the Associated Colleges of the South Summer teaching and learning workshop. This programme was established in 1992 and is targeted at faculty from the ACS consortium of 16 liberal arts colleges and universities in 12 states across the south. The five-day workshop aimed to provide opportunities to hone teaching skills through feedback from small micro-teaching groups as well as large-group plenary sessions on a variety of topics. By adopting a cross-disciplinary approach, the workshop ensures feedback is from the perspective of learners, not disciplinary colleagues who already have mastery of the field. The survey results (370 teachers) suggest that participation had a lasting impact on "professional development and skills". Female respondents, in particular, reported more awareness and thoughtfulness about use of teaching skills, and reported having tried new strategies and taking more risks in their use of different teaching skills since participation. The Persellin and Goodrick findings support other work by Donnelly (2007) in Ireland and Marshall (2004) regarding the value of peer feedback to transfer of skills from teaching development programmes in an environment of trust.

In another review from the US, Romano et al. (2004) reported on skills and knowledge effects of a programme focused on the teaching and learning issues of mid-career academics, examining the impacts of the Mid-Career Teaching Programme developed in 1998 by the Center for Teaching and Learning Services (CTLS) at the University of Minnesota. Participants self-assessed impacts (a self-administered post-programme questionnaire) and a small, representative group were also interviewed. The study emphasised the value to this group of interacting in a structured programme with a peer group of colleagues to focus on ways to strengthen pedagogical knowledge and skills. This emphasised the particular value of embedded peer reflection on both professional and personal challenges. No attempt was made to establish whether or not participation improved classroom practice or student outcomes. The study lacked a control group and was based entirely on self-observation.

A particular focus for the (limited) evidence on skills impacts is on technology-based skills. Two studies, by Howland and Wedman (2004) and by Kahn and Pred (2002), both using a quantitative approach based on pre- and post-test assessments of skill confidence and use, showed participants in programmes felt more comfortable with the embedded use of education technology. Education technology has been an important focus for themed teacher development in many countries, but other focuses, such as enterprise education, where provision of teacher development initiatives is accelerating, do not seem to be matched by systematic research or evaluation into impact on participants' skills and practice.


3.3 Evidence of impact on teachers’ behaviour and practice

The review by Stes and colleagues (2010a) showed that many of the studies looking at changes to teacher attitudes – in various emphases – also looked at issues of the transfer of changed attitudes and knowledge through impacts on teacher behaviour. The scale of some of these studies is limited – particularly where evaluation included resource-intensive methods such as pre-tests and post-tests to self-assess or observe changed practice.

An early contribution, using instructor post-test through observation, was conducted by Nasmith and colleagues (1995) with a small-scale 'experimental' review conducted six months to five years after programme participation, and also harnessing a control group. The results looked at use of innovative teaching methods related to the programme content, but showed no significant difference between participants and the control group. In contrast, Gibbs and Coffey (2004), in a much larger-scale review using pre- and post-test analysis, looked at changes a year after participation and showed through the ATI tool that participants demonstrated more student-centred application after the year than those in a control group. Postareff and colleagues (2007) endorsed this conclusion, also using the ATI tool to show that participants with more credits from participation in teacher development programmes were more student-centred in their teaching behaviour than those with less.
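As a further illustrative aside, the sketch below shows one simple way a pre-test/post-test contrast between programme participants and a control group, of the general kind described above, might be analysed. The change scores, group sizes and effect size are invented and do not come from any of the studies cited.

import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# Invented change scores (post-test minus pre-test) on a student-focused sub-scale
participant_change = rng.normal(loc=0.6, scale=0.8, size=30)  # programme participants
control_change = rng.normal(loc=0.0, scale=0.8, size=30)      # control group

# Independent-samples t-test on the mean change between the two groups
t_statistic, p_value = stats.ttest_ind(participant_change, control_change)
print(f"t = {t_statistic:.2f}, p = {p_value:.3f}")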

Dixon and Scott (2003) provided more specific illustrations of impacts on teacher behaviour. On a small-scale sample of participants, they showed some positive effects for over two-thirds of participants on each of four measured indicators of changed teaching 'behaviour':

• increased relevance of teaching;
• interaction with (and 'movement among') students;
• encouraging students to ask questions;
• making eye contact with students.

However, the overall effects on changing teacher behaviour were mixed, and they also reported very limited impacts in another two tested indicators ('availability of teacher to students' and 'awareness of student responses to their teaching style').

Hewson and colleagues (2001) also showed positive effects (although on different behaviour measures), but were more cautious about the scale of transfer than Dixon and Scott. They used a pre-test and six-month delayed post-test approach from a programme centred on medical educators and showed positive effects for two of 15 tested teaching competencies, with some corroboration of these changes from a student assessment also conducted at the same time. Healey (2000) has also cautioned that transfer of knowledge from programmes may lead to an initial drop in teaching performance as participants get to grips with the issues for changed practice.

In a contemporary study in Australia, after a decade of progressive approaches to teacher development in HE, McArthur et al. (2004) observed from their own research that no differences in subsequent teaching methods were found between teachers who completed a postgraduate certificate and those who did not. However, they still concluded that such programmes had positive effects by pointing out that they:

… enable less experienced faculty to develop teaching and learning attitudes and methods more quickly than they would without undertaking the postgraduate certificate. (p. 10)

There are other small-scale studies, not drawn on here, which showed limited 'transfer' effects of programme participation to practice.

Overall the impression of the available research evidence is of a lack of comparability in its focus and scope. A particular limitation is that the indicators each study has developed for teacher behaviours in practice – the 'transfer variables' they have researched – are highly customised. These do not provide for a common assessment of what behaviours are (or are not) more likely to transfer from teaching development activities.

3.4 Effects of disciplinary or generic programme focus

The HEA has a particular interest in any evidence of the contrasting impacts from teacher development programmes taking disciplinary as opposed to generic (cross-discipline or themed) emphases. The review suggests this evidence is not strong. The distinction is nonetheless important, and Gibbs has recently (2012) talked of a progression towards what he refers to as "discipline sensitive" programmes, as part of educational development activities becoming more sensitive to teaching – and sub-institutional – contexts. He contrasts what had been a polarised approach to the focus of programmes between:

… two parallel teaching development movements … generic, centrally located within universities, with specialisms concerned with educational (not disciplinary) domains such as educational uses of technology. The other has been disciplinary, often positioned as offshoots of national disciplinary associations … These movements have had little to do with each other. (p. 9)

The review suggests that although there has been a variety of disciplinary-focused research, little has been truly comparative. An exception is an important European study across Finland and the UK (204 teachers from Finland and 136 from the UK), where Lindblom-Ylänne et al. (2006), in one of the few studies that looks specifically across disciplinary boundaries, explored the relationship between academic discipline and university teachers' approaches to teaching. They also looked at the contrasting effects of teaching context on conceptual approaches to teaching. Their analyses showed variation in student- and teacher-focused approaches across disciplines (and teaching contexts), with teachers who experience different contexts adopting some contrasting teaching approaches in those contexts. They noted also that teachers from 'hard' disciplines, such as Physics, were more likely to report a more teacher-focused approach, whereas those teaching 'soft' disciplines, for example Media Studies, were more student-focused. These results support previous evidence on interdisciplinary contrasts in teaching approaches, notably by Lueddeke (2003).

Beyond this, the available research usually reflects the specific focus of disciplinary-centred programmes (Sowers and Smith, 2003; Brawner et al., 2002). There is a substantial cluster of evidence here for the longer-established nature of the teaching development of medical educators. Here Steinert et al. (2006) also put together a synthesis of research on faculty development in medical education from 1980-2002, which showed a direct effect of programmes on learner-centred teaching skills.

Beyond specific discipline-based programme reviews, comparative research evidence that contrasts the effectiveness or impacts of disciplinary and generic approaches seems very limited. The often small or quasi-experimental scale of much of the identified research does not lend itself to disciplinary contrasts, and the lack of comparative approaches – as noted above – means that studies are based on programmes with either a generic or a disciplinary emphasis. However, some of the research does comment on the value of generic or interdisciplinary learning in the design of programmes. Donnelly (2007), for example, in his study of the Postgraduate Certificate in Third Level Learning and Teaching in an HEI in the Republic of Ireland, suggests the integration of theory and practice is enhanced by an interdisciplinary design, in particular for new HE teachers.

In the US, the recent research by Persellin and Goodrick (2010) on the teaching workshops programme of the Associated Colleges of the South suggests that by adopting a cross-disciplinary approach, the programme provides for a focus orientated towards learners, and not dominated by disciplinary colleagues with 'mastery of the field'. In the UK, a similar observation has been made by Butcher and Stoncel (2012), looking at an institutional approach to the PGCHE for teachers. Their research suggested that participants embraced a shift from teacher-centred to learner-centred approaches, in part associated with "the benefits of inter-disciplinary discourse".

However, such reflections seem to be few and far between. In addition, a final reflection returns to Gibbs, who raises the issue of what significance should be attached to this dualism, and the value of integrating both generic and disciplinary emphases (Gibbs, 2012). He suggests that most recently central learning and teaching centres within institutions offering 'generic' skills training have become more aware of disciplinary differences, which go beyond disciplinary pedagogies to embrace different disciplinary cultures in how teaching is talked about and changed. If so, a useful research focus – although one not yet evident – would be on the effectiveness and impacts of more integrated programme arrangements that combine generic emphases with aspects of disciplinary differentiation or sensitivity.

3.5 Compulsory versus voluntary participation in programmes

In the UK, participation in generic teaching development activities has been mandatory in some institutions for newly appointed or early-career academics. In others, mandatory participation has been discipline-specific, for example in medical education focusing on developing teaching skills for recently qualified clinicians. However, while access to teaching development for early-career academics is widespread, any mandatory element has been left to institutional strategies. Compulsion has not been driven by national frameworks in the UK, or on the evidence of this review, elsewhere.

Against this background, much of the available research has not looked at contrasting impacts from compulsory and voluntary participation. However, there has been some related evidence of the effects of participant motivation, which has some value in reflecting on the likely influence of compulsory versus voluntary engagement in programmes. In this, De Rijdt et al. (2012), reviewing a series of evaluation studies of teacher development programmes, found "participant motivation" to be the most commonly reported influencing variable on the impact of staff development – although with an effect that was more commonly reported for motivation to learn and less commonly for motivation to transfer learning to practice. The authors contrasted this with research on staff development in other activities outside teaching and HE, where motivation to learn, or motivation to transfer learning to practice, were not the most common influencing variables on the impact of staff development. They concluded that more research was necessary on how motivation works as an influencing factor in professional staff development.

3.6 Student impact of teaching development programmes

In the Stes and colleagues review (2010a) of 37 studies, seven had data specific to the impact on students' experience, with five of these from quantitative assessment. Looking more widely confirms their assessment that this is a poorly developed area for understanding programme impact, although also one with distinct methodological challenges.

There have nonetheless been some important contributions providing student impact evidence, as well as some methodological insights and potential tools for others to use in addressing these challenges. Four of the more recent studies have been looked at in some detail – although not all were covered by the Stes et al. review:

• Trigwell, Prosser and Waterhouse (1999) review the association between teachers' approaches to teaching and students' approaches to learning;

• Ho, Watkins and Kelly (2001) use a pre-test and two staged post-tests to assess impacts on students;

• Gibbs and Coffey (2004) report on a study of the effectiveness of university teachers' training involving 22 universities in eight countries, with parallel student impact evidence;

• Ménard, Legault, Ben Rhouma, Dion and Meunier (2011) describe the early phase of a project to determine the impact of undertaking teacher development training, including on students, in Canada (Quebec).

Trigwell, Prosser and Waterhouse (1999): This study in particular set out to establish the nature of the relationship between teachers' approaches to teaching and students' approaches to learning. They surveyed the students and teachers in each of 48 first-year science classes. The study adopted a modified 'tracking-back' approach, whose utility has since been advocated as a basis for more broadly based and research-centred teaching and learning impact assessment. In advance of the topic being taught, the researchers harnessed two generic tools to provide baseline evidence, with teachers asked to complete the ATI and students a version of the Study Process Questionnaire (SPQ). These tools had been previously developed (respectively by Trigwell and Prosser, 1996, and Prosser and Trigwell, 1998 (ATI), and by Biggs, 1987 (SPQ)), with the SPQ modified to the context of this study.

In this study, data were analysed in two phases, factor analysis and cluster analysis at class level, and results showed a clear relationship between the teachers' approaches to teaching and the students' approaches to learning. Teachers who described their teaching as ITTF were more likely to be teaching students who reported adopting a surface approach. This empirical evidence supported earlier research that consistently showed that surface approaches to learning are related to lower quality learning outcomes (Marton and Säljö, 1976; Van Rossum and Schenk, 1984; Trigwell and Prosser, 1991; Ramsden, 1992; Prosser and Millar, 1989). The research indicated a relation between the approach to teaching and the quality of student outcomes, and provided a foundation for much subsequent development of HE CPD.
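
The class-level association described above can be illustrated with a minimal sketch. This is not the authors' analysis code: the hypothetical per-class dataset, the column names, and the choice of a Pearson correlation and a two-cluster solution are assumptions made purely to illustrate the idea of relating teachers' ATI sub-scale scores to class-mean student SPQ scores.

```python
# Illustrative sketch only: one row per class, holding the teacher's ATI
# sub-scale scores and the class means of the students' SPQ scores.
import pandas as pd
from scipy.stats import pearsonr
from sklearn.cluster import KMeans

classes = pd.DataFrame({
    "ati_ittf":    [4.1, 2.3, 3.8, 2.0],   # teacher: information transmission/teacher-focused
    "ati_ccsf":    [2.2, 4.0, 2.5, 4.3],   # teacher: conceptual change/student-focused
    "spq_surface": [3.6, 2.1, 3.4, 1.9],   # class mean: surface approach to learning
    "spq_deep":    [2.4, 3.9, 2.6, 4.1],   # class mean: deep approach to learning
})

# Class-level association between teaching approach and learning approach.
r, p = pearsonr(classes["ati_ittf"], classes["spq_surface"])
print(f"ITTF vs surface approach: r={r:.2f}, p={p:.3f}")

# A two-cluster solution, loosely mirroring a cluster analysis at class level.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(classes)
print("cluster membership per class:", list(labels))
```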

Ho, Watkins and Kelly (2001): Evidence of student outcomes was provided from a small-scale, in-depth study of the effectiveness of the conceptual change approach to teacher development programmes. The effect of the programme on the participants' teaching conceptions was assessed by identifying and comparing the conceptions of teaching of the participants before and after the programme. Three semi-structured and staged interviews (pre-programme, immediate post-programme and delayed post-programme a year after concluding) were used, with the academic year 1994-95 treated as a control year, before the programme had run, and the year 1995-96 as the test year, when the participants had been through the programme. The study also used generic tools for measuring change, drawing on students' perceptions of a participant's teaching in his/her selected course through the Course Experience Questionnaire (CEQ) (Ramsden, 1991), which provided for a comparison of scores at the end of the control and test years. The consequential effect of the programme on student learning was determined by comparing impacts of participants' teaching on students' studying approaches in the pre- and post-programme years using Entwistle's (1994) revised version of the Approaches to Studying Inventory (ASI).
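
The control-year versus test-year comparison at the heart of this design can be sketched in a few lines. The figures and the use of a simple two-sample t-test are assumptions for illustration only, not the study's actual data or statistical treatment.

```python
# Illustrative sketch only: hypothetical end-of-year CEQ ratings for one
# participant's course in the control year (1994-95) and test year (1995-96).
from statistics import mean
from scipy.stats import ttest_ind

ceq_control_year = [3.1, 3.4, 2.9, 3.2, 3.0]   # student ratings before the programme
ceq_test_year    = [3.8, 4.0, 3.6, 3.9, 3.7]   # student ratings after the programme

t_stat, p_value = ttest_ind(ceq_test_year, ceq_control_year)
print(f"mean change {mean(ceq_test_year) - mean(ceq_control_year):+.2f} "
      f"(t={t_stat:.2f}, p={p_value:.3f})")
```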

The programme brought about detectable conceptual change or conceptual development in two-thirds of the sample group. Subsequently, all the 'changed' teachers received better ratings on their teaching practices from their students in the following academic year, while none of those who did not change their conceptions showed similar gains in student rating scores. A resultant positive impact on their students' studying approaches was observed for half of the teachers who changed their conceptions. Although the findings provide very encouraging results concerning the effectiveness of a conceptual change approach to staff development, the authors note caution due to the very small size of the sample. Additionally, although all teachers whose conceptions had changed demonstrated a significant improvement in their teaching practices as perceived by their students, only half instituted a change in their teaching practices to the extent of inducing a positive change in their students' approaches to study.

The authors contend that, on the basis of their evidence from this study, a change in conceptions of teaching is likely to lead to a prompt improvement in teaching practice, although this conclusion is not consistent with some other evidence regarding student learning. They conclude that advancement in conceptions of teaching is a basis for improvement in teaching practices, and this supports other predictions that, if teachers' conceptions of teaching are developed to a higher level, their teaching practices should improve accordingly (Bowden, 1989; Gow and Kember, 1993; Gibbs, 1995a; Ramsden, 1992; Trigwell, 1995). However, they note that other models exist in the literature to suggest that conceptual changes take place after, rather than prior to, changes in practice (Guskey, 1986), and that in the literature there is an absence of empirical evidence that development in conceptions of teaching will lead promptly to an improvement in teaching practice.

Gibbs and Coffey (2004): A third and important source of broader evidence on student impacts comes from one of the few cross-national reviews. In this, Gibbs and Coffey report a study on the effectiveness of university teachers' training involving 22 universities in eight countries. The study focused on one group of teachers and their students at the start of their training and a year later, together with a control group, which received no training. Evidence was gathered using six scales from the Student Evaluation of Educational Quality questionnaire (SEEQ) (Marsh, 1982; Coffey and Gibbs, 2000) and the 'Good Teaching' scale of the Module Experience Questionnaire (MEQ) (Ramsden, 1991; Gibbs and Coffey, 2004) to establish student ratings for their teachers.

As in the Trigwell et al. (1999) study, the Gibbs and Coffey study also applied the ATI tool (Trigwell and Prosser, 2004), although through a modified approach. This was used to establish the extent to which teachers described themselves as teacher-focused and student-focused in their approach to teaching (Trigwell et al., 1999; Trigwell et al., 2000), together with two scales from the MEQ to establish the extent to which the students of those teachers adopted a surface approach and a deep approach to learning. The authors report that they found evidence of a range of positive changes in teachers in the training group, as well as in their students. Set against this, the analysis from the control group showed a contrasting lack of change, or negative changes, in non-participant teachers.

Ménard, Legault, Ben Rhouma, Dion and Meunier (2011): Ménard et al. describe the early phase of a project to determine the impact of undertaking teacher development training on new teaching staff and students at foundation HE level (specifically in Quebec Cégeps). The only formal requirement to teach at this level is a Bachelors degree in the specific subject (although many new staff have Masters or doctorates), but no formal teaching training is required. A number of universities are therefore offering a 15- to 30-credit programme to support new teaching staff. As this programme is not compulsory and there is no empirical evidence on the impact of this teacher training, Ménard et al. have set up an evaluation programme. The evidence thus far is based on the early stages of the research programme, and reflects not least the challenges of translating American measurement methods to a different language and education system.

To understand the benefits to students, Ménard et al. translated the Motivated Strategies for Learning Questionnaire (MSLQ), widely used in the US, and iteratively tested versions of the translated questionnaire with students. The analysis of the results for students has not proved as robust as hoped in the first year of application and the authors have had to further modify their translations of the MSLQ for the next cohort of students. Their intent is to identify whether there is a benefit to students in order to inform the debate as to whether teacher training should be compulsory at this level of education.

Other sources: A further source of evidence regarding the impact on students' experience draws on what might be referred to as proxy studies, where programme participants are asked to assess effects on student learning. A small number of other studies have used these approaches – often with very small samples – but they are not reviewed in detail here. Such approaches to assessing student outcomes may lack reliability and would be controversial as a sole source of evidence on student outcomes. However, they can produce useful insights, especially when applied to defined target 'end-user' groups (Sowers and Smith, 2003).

3.7 Other programme impacts

The HEA requested this review to assess any distinctive evidence for other impacts and effects, and in particular for:

• programmes centred on younger teachers or those newly entering or pre-entry to the profession (e.g. graduate student programmes);

• contrasting effects of programmes based on national frameworks and others with ad hoc or no links to frameworks/external standards;

• the influence of conceptualisation of teacher development, and in particular of forms of instructional development, on programme impact;

• effects of teaching development programmes on the status of teaching, and teachers' involvement in the scholarship of teaching and learning.

Evidence in these areas lacks differentiation and is often very limited; it is consequently touched upon only briefly here.

Programmes for younger teachers or new entrants: Many of the programmes and initiatives that have attracted research interest have had a focus on younger HE teachers. Some have suggested that these remain the dominant focus in North American institutional initiatives (Kreber and Brook, 2001) or more widely, including in Europe (Gibbs, 2012; Prebble et al., 2004). In the UK, the professional standards framework means that for new academic staff the qualification pathway is increasingly becoming a feature of institutional strategies, although with less impact for established staff. To this can be added provision for aspiring academics, including through graduate teaching support or assistant development programmes.

Butcher and Stoncel (2012) have provided recent evidence through an institutional case study approach to explore the nature and extent of the impact of the UK's PGCHE on teachers appointed for their professional expertise. Data were collected in four iterative stages to investigate perceptions of graduates from the course (2006-2009), as well as current participants midway through their programme. The research involved mixed methods and showed that 'new' teachers were willing to adopt novel approaches to teaching, planning and assessment, with a shift from teacher-centred to learner-centred approaches. To this was added evidence of changes in perceptions of self-efficacy and professional identity. Impacts were discernible at several levels – at individual and departmental levels, on the student experience, and on participants' careers. Beyond this, however, few studies in the UK or elsewhere seem to distinguish between programmes (or qualifications) that are wholly centred on new or younger entrants, or make provision in the analysis for distinguishing effects on older or newer entrants (e.g. where career length is a determinant variable).

One European study has looked at student outcomes specifically for teacher development for newly appointed university teachers (Stes et al., 2010b). In this, there was limited positive evidence of the effects of teachers' instructional development on student learning. Their study collected and analysed quantitative data from more than 1,000 students at pre- and post-tests, using a quasi-experimental design. A multi-level analysis was conducted in which five models were estimated, one of which produced a negative impact.
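
A multi-level (mixed-effects) analysis of this general kind, with students nested within classes and pre-test scores as a covariate, can be sketched as follows. The synthetic data, column names and model specification are assumptions for illustration; this is not the Stes et al. (2010b) model.

```python
# Illustrative sketch only: students nested within classes, with the teacher's
# participation in instructional development ("trained") as a class-level
# predictor and the pre-test score as a covariate. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_classes, n_per_class = 20, 10
class_id = np.repeat(np.arange(n_classes), n_per_class)
trained = np.repeat(rng.integers(0, 2, n_classes), n_per_class)      # class-level predictor
pre = rng.normal(3.0, 0.5, n_classes * n_per_class)
post = pre + 0.2 * trained + rng.normal(0, 0.3, n_classes * n_per_class)

students = pd.DataFrame({"class_id": class_id, "trained": trained,
                         "pre_score": pre, "post_score": post})

# Random intercept per class; fixed effects for pre-test score and training.
model = smf.mixedlm("post_score ~ pre_score + trained", data=students,
                    groups=students["class_id"])
print(model.fit().summary())
```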

The authors of this study concluded that instructional development for new or aspiring teachers in HE does not easily translate into measurably improved 'early career' practice. They point in particular to the difficulties of finding measurable effects on students' perceptions of change and in the teaching and learning environment.

In addition, the evaluation of the Mid-Career Teaching Programme developed in 1998 by the CTLS at the University of Minnesota (Romano et al., 2004) focused on mature teachers in HE (i.e. mid-career). However, this lacked a control or comparison group, for example among younger academics.

Contrasting effects of programmes based on national frameworks and/or external standards: There is some evidence that training to national or other professional standards does produce added value (Gibbs and Coffey, 2004; Nasr et al., 1996; Haigh, 2012). However, the evidence is predominantly situational and lacks any controlling for the additionality of effects. Beyond this, we have been able to find no evidence of research that has commented specifically on the distinctive effects of programmes based on either external (professional) standards or national frameworks.

However, not all researchers have agreed with the value of standards-based approaches. Orr (2008), for example, looking at professional development of UK further education lecturers (including those working on foundation degrees), has concluded that externally imposed frameworks reflected "restrictive and impoverished notions of professionalism". His work led him to believe that more meaningful and autonomous professionalism, institutionally or sub-institutionally focused, may evolve if teachers are permitted to select their own CPD agendas, rather than having such choice limited by prescribed standards and content. However, the transferability of his conclusions to mainstream HE contexts is unclear.

The influence of different forms of instructional development on programme impact: The evidence base here is limited. As has been demonstrated, much of the research base is not comparative and emphasises reviews of specific programmes – often centred on a single intervention such as participation in an instructional development course, workshops or seminars, active observation or reflective portfolio methods. Evidence to contrast the effects of different programme modalities is consequently limited to the handful of studies looking across various individual studies.

In looking at modalities, Steinert and colleagues (2006) suggested that collective delivery, typically through a structured course, had proven effective in changing teaching attitudes. However, the assessment was centred on medical education, where the context of delivery was perhaps atypical of other areas of 'academic' staff development in HE.

The recent study across 36 teaching development initiatives by Stes and colleagues drew on a variety of disciplines and contexts and was able to contrast impacts for teachers' learning and behaviour, institutional impact and impact on student learning by modality (Stes et al., 2012). The results, however, were not clear on relative impacts. Indeed, the authors concluded that 'collective' (i.e. course-like) teaching development initiatives, while having positive impacts on teachers' learning and (where there was evidence) on student learning, had less common impacts on translating the knowledge and skills learned to teaching behaviours and practice. The same analysis also showed that teaching development initiatives that did not rely on collective course-based delivery, or were hybrid in their approaches (e.g. combining collective delivery with participant mentoring), had stronger effects on teacher behaviour. However, as Stes and colleagues pointed out, these findings seem to be contradictory, since evidence of impact on students could only come about through effective transfer of knowledge to teacher practice. The authors concluded that the evidence base was too small to generalise at this level.

Effects of programmes on the status of teaching and involvement in the scholarship of teaching and learning: These two issues are interrelated and are included together here. However, for both there is limited apparent evidence from available research. Steinert and colleagues, in a cross-discipline review of teaching staff development in medical education, established that evaluations of teaching development programmes in this area showed a positive impact on teachers' attitudes towards the utility of such development (Steinert et al., 2006). However, beyond this, there is very little distinctive evidence of the effects of teacher development programmes on sociocultural factors such as the status of teaching in HE.

On the issue of the impact of programmes on engagement with scholarship, Brew and Ginns (2008), using the Scholarship Index (SI), have documented a positive relationship between engagement in the scholarship of teaching and students' course experience at the department/faculty level. This was a multi-indicator assessment of 'scholarship engagement', which included graduate teaching certificate programmes but went beyond these to include SI measures such as teaching awards, publications and presentations on university teaching methods. As such, it lacked a more specific association with teacher development initiatives. However, they concluded that where a department/faculty was highly engaged in the scholarship of teaching, students in that faculty/department were significantly more likely to have described experiencing higher quality courses.

Beyond this we have found little evidence seeking to relate teacher engagement in such programmes with their involvement in the scholarship of teaching and learning. This is not to say that no such relationships exist; indeed, the burgeoning research base of the last decade in particular might informally suggest a positive association, but there seems to be no systematic evidence as yet to confirm such a link.

Section 4: Impact research methods and models

4.1 Introduction

An important focus of the review has been to look beyond the evidence of findings to better understand the effectiveness of previous research conducted, including the use of different approaches and tools. This is drawn together here to look at:

• the strengths and merits of the available evidence of impact assessment;

• opportunities for improvement of impact evidence of teaching development programmes;

• practical challenges evident from impact assessment and evaluation.

Set against the context of individual studies, this part of the review provides a critical assessment of the methods used and also draws on scholarly commentary on the robustness and utility of these approaches. However, as some of these commentators have themselves noted (Stes et al., 2010a; Trigwell, 2012), the published sources are highly variable in the depth and detail of their descriptions of methodology, and this limits the assessment. A useful starting point, however, is to look at overarching frameworks that have been proposed and used for guiding research strategies.

4.2 Impact assessment models and frameworks

This review has drawn together a diverse range of studies, albeit drawn from very different programme and cultural contexts and contrasting methodological foundations. Although some studies have used common tools (Trigwell et al., 1999; Ho et al., 2001; Gibbs and Coffey, 2004) or variants of these, a particular challenge is the difficulty of establishing comparability of study results. Several of the researchers, notably Stes et al. (2010a), reviewing past evidence or reflecting on their own analyses, have come to the conclusion that common indicators, or a framework for choosing and applying these, are needed to enable studies to build upon each other.

Any such framework would need to be sensitive to programme contexts, and the need for researchers to take into account the individual differences of teachers participating in staff development initiatives is highlighted by Levinson-Rose and Menges (1981). Weimer and Lenze (1998) similarly draw attention to the importance, in comparative studies, of taking into account fields related to HE. The developing scholarly interest in the field has been accompanied by proposals for different evaluation frameworks and models (Van Note Chism and Szabó, 1997; Guskey, 2000) and from others drawn more widely than HE (Kirkpatrick, 1998).

Guskey’smodel(2000)hasparticularrelevancetothisreviewsinceitsuggestedimpactfromteacherdevelopmentprogrammeswouldbeatfivedifferentlevels:

• academicreactions;

• participants’conceptualchange(teachingknowledge,beliefsandperceptions);

• participants’behaviouralchange(changingpracticeanduseofskills/techniquesanddifferentlearningstrategies);

29

• developmentandchangeinorganisationalsupportforteacherdevelopment;

• changestostudentlearningandperformance.

Although not centred on faculty development alone, these five levels seem to provide a valuable and practical analytical framework. Trigwell (2012) has since explored the application of these levels to teacher development contexts and has established that, from available (published) studies, the evidence base is strongest for the second level (conceptual change). He has also suggested that these studies provide some evidence for the third (practice change), but are weaker on changes to student learning (and often dependent on proxy measures) and weaker still on organisational change.
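
To make the framework concrete, the sketch below holds the five levels as a simple data structure against which evaluation evidence could be logged and tallied. The structure and the example entries are assumptions for illustration only; they do not represent findings from any cited study.

```python
# Illustrative sketch only: logging evaluation evidence against the five levels
# of Guskey's (2000) model as summarised above. Example entries are placeholders.
GUSKEY_LEVELS = [
    "academic reactions",
    "participants' conceptual change",
    "participants' behavioural change",
    "development and change in organisational support",
    "changes to student learning and performance",
]

evidence = {level: [] for level in GUSKEY_LEVELS}
evidence["participants' conceptual change"].append("pre/post ATI scores")
evidence["changes to student learning and performance"].append("class-mean SPQ deep-approach scores")

for level in GUSKEY_LEVELS:
    sources = evidence[level]
    print(f"{level:50s} {len(sources)} source(s): {', '.join(sources) or '-'}")
```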

A slightly later attempt to model an approach to impact evaluation was proposed by Kreber and Brook (2001). This was centred on faculty development, drawing on a broad review of North American studies evaluating HE-based teaching development programmes. The approach was not aligned with the Guskey model, but shared similar features. It set out a six-level framework, with three levels based on participants' outcomes (perceptions/satisfaction; teaching beliefs; teaching performance), two on student outcomes (perceptions of staff's teaching performance; learning outcomes) and one on organisational culture. Each level was said to be capable of being evaluated separately or together. The Kreber and Brook model also set out different implications for evaluation design for five of the most common faculty development activities:

• structured (usually central) course(s) on learning to teach in HE;

• individual consultations with teaching staff, including mentoring and one-to-one coaching;

• seminars and workshops;

• collaborative action research studies on teaching;

• peer consultation programmes.

The Kreber and Brook model suggests a framework whereby impact can be evaluated for any of six hypothesised levels within each of the five constituent activities (as above). It also sets out separate design issues within each. Unlike with the Guskey model, however, there is no evidence of literature that benchmarks subsequent evaluative practice against this model or otherwise tests the utility of the approach.

A more recent interdisciplinary research review has taken a different approach to building a conceptual framework (De Rijdt et al., 2012). This has looked widely across evaluative frameworks, drawing upon management, HRD and organisational psychological research as fields of research seen to be most closely related to education and staff development. Through this, the authors propose a conceptual framework based on defining influencing variables and moderating variables that have the potential to affect the transfer of learning in staff development activities. They base this approach on the assessment that transfer of professional learning to the workplace is complex and not easy to achieve, and cite management studies as establishing that only 10% of learning actually transfers to job performance (Fitzpatrick, 2001; Holton and Baldwin, 2000; Kupritz, 2002).

The De Rijdt et al. model also draws attention to the fact that the research design of the studies measuring transfer of staff development learning can have an impact on the outcomes reported. They strongly suggest that the research design of studies into transfer of learning should change from short-term measurement to a gestation period of 12 months.

With what seems a handful of exceptions, these models – earlier and more recent – seem to have had remarkably little effect on research or evaluation practice. They remain, for the assessment of impact in teacher development, essentially theoretical models that may have utility in building subsequent approaches, but are essentially untried as methodological frameworks.

The conditions affecting the development of research or evaluation approaches, and the precise methodologies adopted by researchers, are rarely clear from the published research. However, the diversity involved seems to suggest that methodology is determined by the essentially localised dimension of much of the evaluative focus, with approaches adopted being highly customised and essentially pragmatic.

4.3 Evidence strengths and limitations

This broad review has looked across numerous studies, conducted in different contexts and in 11 different countries, and provides a useful starting point to assess common strengths and merits of research approaches – and also limitations.

Strengths: Looking across the evidence base suggests three broad strengths. The first of these is the sheer diversity of the available evidence. While this often lacks accessibility and provides challenges in its comparability, the number of studies means that numerous research methods have been developed and tried, and this in turn provides a wide (if not always deep) methodological foundation from which future researchers can learn.

A second and related merit of the evidence base is that it is international. Although lacking more than a handful of cross-national studies, this means that the research methods developed have been tried in many different institutional contexts, and this in turn provides for some future-proofing in changing national contexts.

The third strength, and a consistent merit of the research base, is that it is highly applied and practical. This applied focus is aided by the fact that individual studies are commonly looking to measure, and describe, practical effects of participation and realised benefits for participants.

Limitations: While these are important strengths, on which other scholarly inquiry can build, for the purposes of this review the evidence base also has many limitations. One issue is that the breadth of the available research is counterbalanced by the small scale of much of the research, especially from the US and Canada, which centres on the impacts of specific (individual) programmes and is usually limited to assessments at a specific point in time, which soon become out-dated as programme (and HE) contexts change. The breadth (and quality) of this evidence base is consequently patchy, as well as being highly fragmented. Almost by definition, this means the evidence lacks comparability.

The consequence for this review is that the source evidence limits what can be said about the characteristics of programme impact. Where it exists, the available impact evidence is often highly generalised, may have shallow roots, and rarely develops or harnesses robust quantitative measures (or tools to mobilise these). In particular, the available evidence shows:

• The evidence available seems much stronger on seeking to isolate high-level effects on teacher participants. In this, the evidence is strongest regarding changes to teacher attitudes to, and conceptions of, teaching. Here, the Stes review (2010a) provides a valuable appraisal, across 27 (of their 37 reviewed) studies, of methods and utility, which we conclude has much wider relevance.

• Much of the available research focuses on documenting effects and not changes resulting from programme participation. While useful, the common lack of baseline measures (e.g. pre-tests) means this provides for a very limited understanding of the 'net' effects of programme delivery and very little evidence of the added value of participation. Since 1981 leading commentators (Levinson-Rose and Menges, 1981) have been calling for more robust and systematic research methods that could better understand 'net effects', but with apparently little effect on the research strategies used in subsequent studies.

• The available evidence is dominated by what amount to snapshot measures – often taken at, or very shortly after, completion of programme participation. Stes and colleagues (2010a), looking across a series of studies, have drawn attention to the limitations in the methodologies used by researchers to assess – and control for – programme effects. This is an important limitation, not least because the few studies that have looked at effects six months or a year afterwards (such as Postareff, 2007) indicate 'slow-burn' effects and even the possibility of regressed quality of practice as participants gain confidence and/or pedagogic breadth. Consequently, 'snapshot' studies risk providing a misleading picture of either sustained changes to attitudes or actual transfer to practice. Policy or other decision-makers relying on such evidence at institutional level and above may be making judgements on future investments or delivery approaches based on unsound or premature evidence.

• This review concludes that while changes to teacher knowledge and skills have been a common impact focus in research, the evidence provided is often based on very specific or highly generalised measures. This was seen as a weakness in earlier studies (Levinson-Rose and Menges, 1981) and seems to have changed little since. A combination of the limited evidence base of knowledge and skills impacts and the great diversity in what is measured means that this evidence lacks comparability and provides a weak foundation for sharing knowledge on the skills development effectiveness of programmes.

• Parallel evidence of the impacts on teacher practice coming from the transfer of changed attitudes, conceptions, knowledge and skills is also lacking in substance and breadth. Stes and colleagues (2010a and 2010b) have commented on the limitations of self-assessment evidence taken at, or soon after, participating in programmes, and also on the paucity of deferred post-test evidence. What evidence is available also provides mixed messages about the effectiveness of knowledge transfer to practice – making the need for more robust assessment all the more important. Trigwell proposes that a new and more systematic evaluation approach, based on contextualised evidence, is needed to shape such a focus (Trigwell, 2012).

These are important limitations, which would need to be taken into account in any future national stimulus to further research and evaluation.

4.4 Specific evidence improvement opportunities

In addition to some of the methodological gaps set out above, the studies also suggest some possible programme effects that are based on evidence that is either shallow or contradictory, where more research is needed to identify effects and mechanisms. The available evidence suggests this might usefully focus on three main areas:

Programme delivery/modality:

• Longer length of programmes and/or duration of activities positively affect the quality of learning and transfer potential, but the extent of this for different programme modalities is uncertain.

• Staff development interventions extended over time show more positive results for transfer of learning than one-time interventions.

• The modality and nature of the staff development intervention conditions the quality of impact, with on-the-job learning having a more positive impact on transfer of learning than 'off-shore' programmes. However, there is limited evidence of why this is the case and what the influencing mechanisms are.

• There is little robust evidence with which to understand the wider motivators and influences on knowledge transfer from programme participation to changed teaching behaviour and practice.

Programme recruitment and participation:

• Experienced teachers show more and earlier transfer of learning from programmes aimed at teaching development (i.e. to practice) than their novice colleagues.

• Novice teachers may require a critical mass of teaching development input, or of 'foundation' pedagogic knowledge and understanding, before transfer potential becomes positive, but the content of this, set against different participant starting points and delivery contexts, is unclear.

• Novice teachers also seem to show more response to collaborative arrangements in programmes, and this positively affects transfer of learning where it involves collaboration with more experienced colleagues.

• Novice teachers will also gain most from programmes where they are supported by others, including working within communities of practice, but there is little evidence of what works most effectively in wider knowledge sharing and support.

Student outcomes:

• There is a substantial lack of direct evidence on outcomes for students, and in particular on what features of teacher development activity positively affect the potential for change in student learning and outcomes, and on the determinants of this influence.

• This seems to be a deep-seated inadequacy (i.e. since Levinson-Rose and Menges, 1981) in measuring the impact on student learning and outcomes within programme evaluation. Subsequent analysis (Trigwell et al., 1999; Kreber and Brook, 2001), most recently by Trigwell (2012), has suggested this reflects a lack of common tools, such as a student engagement questionnaire, and/or common indicators, and that action research is needed to inform the development of tools capable of customisation and wide application.

A final issue centres specifically on the non-UK evidence. The focus of this review has been on the impact evidence, and where this has not been available from published research we have been unable to take account of cross-national experience. At the same time the research has shown that the evidence base continues to develop, and some countries have established either important programmes and/or arrangements for networking experience that might reveal important insights through a review of their experience that goes deeper than identifying published research. This might provide important methodological lessons on how others have been, or are, developing arrangements for assessing the effectiveness and effects of these investments.

4.5 Practical challenges in impact evaluation

This analysis would not be complete without acknowledging some of the challenges presented by impact evaluation of programmes and activities such as these. While there has been no systematic assessment of these across such studies, supporting evidence can be drawn from wider experience and from commentary within the published studies. This suggests that evaluators and researchers seeking to better understand programme impact may face a number of specific challenges:

• Low priority: Unless there are operating or (external) funding issues that require impact evaluation, individual or collaborating HE providers may not have a natural inclination to proactively establish it. In particular, individual HEIs may be under no direct obligation to formally evaluate the effectiveness of teaching development programmes. Even where HEI funding is drawn wholly, or partly, from 'programmes' (such as CETLs) directly resourced by HE-related non-executive agencies (in the UK), the focus of such agencies on not constraining institutional initiative and freedoms seems to have resulted in participant HEIs not being obligated to conduct evaluation as a condition of securing funding.

• Lack of appropriate research funding: Publicly funded HE-led research in the social sciences is under considerable competitive pressure in the UK, as in other developed economies. However, these pressures may adversely affect emerging areas of inquiry such as HE teaching development studies. Much of the research drawn on in the previous chapter has stemmed from the initiative of individual institutions and individuals within them. While funding sources are often not clear from publications, except where stemming from charitable, external grant aid or publicly funded agencies, the impression is that many studies stem from low-budget or institutionally funded scholarly inquiry. We note that evaluation of large-scale and impact-centred research funding exercises in the UK, such as the ESRC/HEFCE Teaching and Learning Research Programme (Parsons et al., 2011), showed that none of its 57 major research awards were directed at teaching development activities in HE.

• Capacity and research experience in the field: The quality of relevant experience, and continuity, in the research base may be being adversely affected by staff turnover among researchers. This would seem to be reflected in a lack of incentive or durability for individuals in pursuing further research in this area. Consequently, and although this is not a systematic assessment, of the publications cited in this review fewer than one in ten authors are cited more than once, yet a third of the citations (36) stem from just seven authors internationally7. There may be many explanations for this apparent clustering of experience, but it may suggest that for many contributors the engagement in impact research relating to these programmes is not an enduring aspect of their research interests – and perhaps of their acquired skills and knowledge.

• Skills in programme evaluation and impact assessment: While the researchers producing the publications cited in this review have no doubt had substantial and in many cases long-established research skills and credentials, many may have had less experience in programme evaluation and impact assessment. Other research (Parsons et al., 2011) has shown that HE researchers in particular may lack skills in setting and measuring impact indicators, and this may be a constraint for applied researchers in this field. Others may feel their research skills are not well placed to tackle the particular challenges of impact attribution in evaluating teacher development programmes.

• Constraints to longitudinal analysis: Researchers need a commitment to extended research and evaluation timetables to be able to support longitudinal analysis. One possible constraint to this will be a focus on summative evaluation of programmes – which does not provide for post-participation review, and/or the limitations imposed by what may be essentially short-term funding for evaluations or applied research. It seems likely that funding periods of at least 24-36 months would be necessary to support longitudinal analysis, and this is unlikely to be supported by many funding pathways.

• Generic assessment tools capable of customising to different programme and HE contexts: Many researchers have drawn attention to the lack of research frameworks to guide the assessment of programme impacts (Butcher and Stoncel, 2012; Kreber and Brook, 2001; Postareff et al., 2007; Roxå et al., 2011) and others have drawn attention to the need for specific research tools – in particular to guide the assessment of student impacts (Stes et al., 2010a; Trigwell, 2012). Where particular tools have been put forward, the research (and tools) may now be very dated (Marsh, 1982). This lack of appropriate frameworks may reinforce the constraints on skills and experience touched on above, and will be compounded by the lack of generic or customisable tools, with researchers facing 'reinventing the wheel' for each programme assessment or wider research.

7 Including author citations in multiple-author papers.

• Academic priorities: In the UK at least, time pressures on, and priorities for, academic staff, including those engaged in the development and delivery of teaching development programmes, may have emphasised engagement in other areas of scholarly activity. Until recently the limited range of specialist journals linked to academic development, or limited prioritisation of the subject in wider HE journals, may also have constrained publication opportunities – indirectly diverting research attention to areas more likely to secure publication – and individual and departmental value in the Research Assessment Exercise (RAE) and now the Research Excellence Framework (REF).

• Programme differentiation: In some cases, evaluative evidence does not distinguish between the effects of teacher development and wider QA improvement initiatives. This review has shown that where teacher development activities are embedded in wider quality or improvement programmes for the sector, or parts of it, impact evidence of such initiatives can be lost in wider evaluations.

In sum, it seems likely that the (lack of) availability of research funding or appropriate resources for systematic evaluation will have acted to hold back the conduct of publishable quality research, or the scale of that conducted. At national level this is surprising given the importance that policy makers have attached to evidence-based policy and to assessing the cost-effectiveness and value of publicly funded investments. At institutional level, it may question the extent to which decision-makers appreciate the value of systematic formative and summative evaluation to improving teaching development (and other) practice.

Section 5: Next steps

5.1 Introduction

This final section of the review draws together the evidence set out in Sections 2 to 4 of this report. To meet the HEA's information needs these are set out as follows:

• an overview of evidence of impacts from teacher development programmes;

• the limitations of understanding in relation to evidence needs and gaps.

An assessment is also set out of proposed next steps to support a series of recommendations provided to the HEA. This may aid the HEA in using and taking forward this review.

5.2 Evidence of impacts

The available evidence on the impact of teaching development programmes, in their various guises, is positive. This is most notable for changes to teacher attitudes and conceptions, although with a more confused picture regarding the transfer of this knowledge to practice. Not all past researchers have come to the same conclusion, and some are hesitant, on the evidence base, to do so. However, this assessment does seem to reinforce a majority view among those engaged in (published) scholarly inquiry (Trigwell, 2012).

These positive impacts may go wider than residual effects for individual participants. Some researchers have gone further to suggest that this in turn is creating greater confidence in the sector regarding the utility of such interventions. Steinert et al. (2006), for example, suggested this is reflected in more positive attitudes to faculty staff development itself.

Particular issues that emerge from the research, although not always consistently, are:

• There is a positive association between participation in teacher development programmes and individuals' propensity to develop (or enhance) learner-centred teaching methods. This is important since a range of wider scholarly and pedagogic-centred research studies have shown such methods are in turn associated with stronger student outcomes in HE.

• Impacts on teacher knowledge and skills are less clear but seem to be positively affected by a combination of longer duration programmes, integrated support (especially for newer teachers) and continued formal inputs from continuing professional development.

• Impacts may be more readily achieved for established teachers, but the available evidence suggests there is substantial potential for transfer to practice among 'novice' or aspiring teachers where a critical mass of pedagogic knowledge is achieved.

These are important positive effects, but the value of the evidence we have found is held back by some serious evidence gaps, returned to below, and by a lack of comparability. This reflects, in part, the fragmentation of approaches, with different research looking at programmes that may have radically different aims and different programme contexts. Even within programmes, these will be delivered through different modalities, making direct contrasts of effects very difficult. The research itself will also be shaped to meet specific needs and contexts and, as such, will be using different indicators or measures of performance, different evidence collection methods or combinations, and with often very different focuses and scale of evidence collected.

Put together, while this provides for a rich and diverse evidence base, it makes drawing common lessons problematic. Where some lessons do emerge, they reflect the focus of much of the research on scholarly inquiry and are not necessarily geared to the interests of policy makers, in particular those concerned with evidence-based policy. Key policy research issues such as understanding the determinants of programme effectiveness and impact successes, as well as the additionality of realised benefits, are almost absent from the published research. This is not to criticise the researchers, but it does question why a policy intervention that has secured large-scale public investments, especially in the last ten to 15 years, has seen so little attention paid to the evaluation of its impacts and to the factors that will enhance future decision-making about programme funding and focus.

5.3 Evidence needs and gaps

This review started with the premise that research on programmes with an impact focus was going to be difficult to isolate and limited in scope. In fact, the authors have been impressed with the range of what does exist – albeit with some challenges in its diversity and fragmentation.

The previous section of the report has also documented some of the apparent weaknesses – and challenges – emerging from the evidence base. While there has been a notable widening of the range of studies available in the last ten to 15 years, the substance and utility of much of this seems poorly placed for informing policy development relating to teacher development – in the UK and elsewhere.

Looking across different countries' experiences – and evidence – this does not seem to be a novel assessment. Several cross-cutting assessments, looking across the research base, point to the scarcity of well-designed studies, variously calling for more and better-designed research, and evaluation tools, on the impact of staff development (Weimer and Lenze, 1998; Kreber and Brook, 2001; Steinert et al., 2006; Stes et al., 2010a; Trigwell, 2012). Where proposals are made for stronger methodological foundations, these tend to emphasise the importance of more qualitative evidence to support the common reliance on self-assessment by teachers (Steinert et al., 2006), or of mixed-method studies (Levinson-Rose and Menges, 1981; Weimer and Lenze, 1998; Stes et al., 2010a), in order to gain insight into the process of knowledge transfer to practice and generating teacher and student outcomes.

Beyondresearchers’ownreflectionsthisreviewhasbeenabletodrawlimitedevidenceoneffectivenessofmethodsfrompublishedstudies,inpartbecauseauthorsoftenprovidepartialorlimiteddetailsonmethodologyusedtoassessimpact.Thisisparticularlythecasewherethereareincompleteorinsufficientdetailsonscaleandscopeofevidencecollection.However,thisreviewsuggeststhatissuesemergingforimprovedmethodsforcharacterisingandbetterunderstandingofimpactinclude:

• Pedagogical clarity: Teaching development activities are diverse – with different emphases, modalities and content. Researchers often provide limited evidence of important issues of context in what is being evaluated for impact, and notably on issues such as the pedagogical approaches underpinning the focus of the studies being carried out. The reasons for this are unclear, but it represents a serious limitation to sharing knowledge from such studies, and to critically appraising the impact messages.

• Use of proxy measurement: Where researchers have sought to review impact, usually as part of a wider evaluative focus, these studies often rely on what amount to indirect or proxy measures (e.g. academics' self-assessment of enhanced student learning), whose usefulness has been questioned in some of the literature (Levinson-Rose and Menges, 1981). Self-assessment can lack reliability, especially when drawn from customised determining variables. There would seem to be much greater potential both for harnessing existing tools and frameworks (e.g. ATI and MEQ for benchmarking teacher attitudes and practices, and SPQ, ASI, SEEQ or others for assessing student outcomes) and for mapping post-programme changes. These also offer opportunities for customising content or focus and for use with control groups, while retaining the value of building comparative evidence for use by other researchers.

• Breadth of impact measurement: Other researchers have approaches that may centre on single-sourced direct measures including, for example, very broadly based methods such as Likert scales of student satisfaction with teaching. Focusing evidence gathering may have the value of intensifying the effort, securing earlier results and reducing costs, but works against a deeper, or contextualised, analysis and understanding of impact and has limited value for empirical review of impact.

• Understanding impacts on students: To date much of the available impact evidence focuses on the effects on teachers' attitudes and aspects of practice. As noted, there is rather less attention paid to the impacts observed from student experiences, either through direct feedback or observation, although there is some discussion of the value of non-causal evidence from student feedback. Some observers have chronicled difficulties in securing wider student feedback, yet this seems at odds with the demonstrably effective approaches from researchers such as Professor Trigwell and colleagues, and Gibbs and Coffey among others. However, Trigwell has recently cautioned that no appropriate tools fully reflect the need for a generic approach to measurement and noted:

A better proxy for change in student learning at Guskey's (2000) level five would therefore be a teaching engagement questionnaire that includes those additional aspects of teaching that development programmes are aiming to achieve, such as communication and scholarship, as well as an understanding of subject matter. At the present time no such questionnaire appears to exist. (2012, pp. 259-260)

Public policy also seems set to place an emphasis on the outcomes for end-users – the students themselves – and this can no longer be neglected in favour of the more readily secured evidence on teacher perceptions.

• Use of longitudinal studies: If student outcomes evidence is less easy to generate and to come by, longitudinal studies seem even more limited. Getzel et al. (2003) and Sowers and Smith (2003) have been among the few who recognise this as an evidence gap and have called for institutional responses to look at progression and/or sustainability of impact. Longitudinal analysis seems the area of research investment most likely to be negatively affected by funding and/or time constraints on researchers. Such evidence is almost wholly small-scale. However, with the evidence that is available suggesting that impacts on teaching quality may need a critical mass of pedagogic appreciation, and may be progressive or even 'slow burn', it would seem that impact needs to be viewed not from the perspective of changes shortly after participation, but in the medium term – a year or more after participation has concluded. This is not a vague call to arms, and specific methods are offered by some existing research. Delayed tests have been used successfully by some (Ho et al., 2001), albeit largely at a quasi-experimental scale. Inventory methods have also been proposed by recent researchers in the field (Trigwell and Prosser, 2004) and would seem to have scope for much wider use among researchers if their application is sufficiently well understood.

• Use of tracking studies: A separate but related evidence gap seems to be the lack of tracking or larger-scale studies. These may provide for continuing analysis of change on a cross-institutional basis, but also provide benchmarks for institution-level evaluation – as proposed by Gibbs (2010), Hardy (2008), and Ho et al. (2011). This might also provide an empirical basis for assessing the relationship between espoused beliefs and teaching practice (Kane et al., 2002), where the evidence seems too limited to have enabled sufficiently thorough investigation to draw any definitive conclusions. Such a focus, however constituted, might also provide a suitable platform for addressing some of the uncertain issues or contradictory findings emerging from the patchwork of evidence available, as outlined above.

• Use of control groups: Current methods focus on programme participants as direct beneficiaries, and the use of proven impact assessment methods based on evidence triangulation seems more limited, in particular the use of control groups. Such evidence is vital in establishing the measure of change and outcomes from programme investments and also the extent of knowledge/learning transfer to practice. Control group methods are also a reliable basis for establishing the additionality of the effect (i.e. from the counterfactual position of what effects would otherwise have occurred without the investment), as sketched below. It seems likely that in the UK and elsewhere, those funding such investment, at institution level and above, will be placing greater emphasis not on benefit realisation from programmes but on evidence of its additionality, and few studies are well placed yet to address this need.
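
The additionality logic referred to in the final bullet can be illustrated with a minimal, hypothetical sketch: the net (additional) effect is the participants' change minus the change observed in a comparable control group, which stands in for the counterfactual. The figures and variable names are assumptions for illustration only.

```python
# Illustrative sketch only: estimating additionality from pre/post measures for
# a participant group and a control (counterfactual) group. Hypothetical values.
def mean(values):
    return sum(values) / len(values)

participants_pre  = [3.2, 3.0, 3.4, 3.1]
participants_post = [3.8, 3.6, 3.9, 3.7]
control_pre       = [3.3, 3.1, 3.2, 3.0]
control_post      = [3.3, 3.2, 3.1, 3.1]

gross_effect   = mean(participants_post) - mean(participants_pre)
counterfactual = mean(control_post) - mean(control_pre)   # change expected without the programme
print(f"gross change {gross_effect:+.2f}, counterfactual {counterfactual:+.2f}, "
      f"additionality {gross_effect - counterfactual:+.2f}")
```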

A final observation on the strengths and weaknesses of the evidence base goes beyond analysing programme effects, to looking at causes and causality. This review has suggested that the available evidence is notably weak on the determinants of programme impact and effectiveness. In general, studies seem stronger on measuring realised benefits – for teaching staff and, to a lesser extent, for students – but with too little focus on how and why such benefits come about, or on explaining why such programmes are having an effect.

A crucial area for 'determinants' evidence seems to be more information on what underpins knowledge/learning transfer from programmes. In this, De Rijdt and colleagues (2012) have compared findings on 'influencing variables' with conclusions from selected reviews of 'training transfer' in the fields of management, HRD and organisational psychology (Baldwin and Ford, 1988; Blume et al., 2010; Burke and Hutchins, 2007). They propose a focus for future research on influencing variables and in particular:

• motivation to learn;
• motivation to transfer;
• needs analysis;
• active learning;
• self-management strategies;
• strategic link;
• transfer climate;
• supervisory support.

They further suggest a number of moderating variables that could be of importance within the context of staff development in HE and that need further research. These are:

• time lag versus no time lag;
• self-measure versus other measure of transfer;
• use measure versus effectiveness measure of transfer;
• open skill versus closed skill.
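
Taken together, these influencing and moderating variables could be held as a simple checklist against which an evaluation design is screened. The sketch below is illustrative only; the structure and the screening function are assumptions, not part of the De Rijdt et al. (2012) framework itself.

```python
# Illustrative sketch only: the influencing and moderating variables listed
# above, held as a checklist an evaluation design could be screened against.
INFLUENCING_VARIABLES = [
    "motivation to learn", "motivation to transfer", "needs analysis",
    "active learning", "self-management strategies", "strategic link",
    "transfer climate", "supervisory support",
]
MODERATING_VARIABLES = [
    "time lag versus no time lag",
    "self-measure versus other measure of transfer",
    "use measure versus effectiveness measure of transfer",
    "open skill versus closed skill",
]

def unaddressed(design_covers: set) -> list:
    """Return the influencing variables a given evaluation design does not yet cover."""
    return [v for v in INFLUENCING_VARIABLES if v not in design_covers]

print(unaddressed({"motivation to learn", "transfer climate"}))
```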

This seems a useful starting point to understanding determinants and would provide for greater comparability, but the focus could go wider to look at other contextual factors. Whatever focus is taken, this remains a critical evidence gap, since, as Trigwell (2012) has recently argued, only by asking 'why' will those designing and delivering programmes be able to improve their effectiveness and further raise impacts from what is being provided.

To tackle this, Trigwell proposes a fundamental shift in the focus of researchers to emphasise understanding:

… relations between the context, mechanism and outcomes. For teaching development programmes this means finding out what actions lead to what outcomes for what people. (2012, p. 263)

Others have suggested different models for new evaluation methods that can capture appropriate impact evidence (Van Note Chism and Szabó, 1997; Guskey, 2000; Kreber and Brook, 2001). Trigwell (2012) builds on this to suggest that a stronger emphasis on 'determinants' of impact could be addressed by adopting an emphasis on methods of Realistic Evaluation (Pawson and Tilley, 1997).

This review is not well placed to challenge or endorse such proposals, although the authors would encourage choice and customisation of structured methods that are consistent with cost-effective delivery, including for larger-scale and longitudinal samples. However, to Trigwell's call for a stronger focus on determinants of impact, we would add the need for initiatives in the design and use of tools, and evaluation frameworks, which could encourage more comparability of findings from individual studies, while also systematically addressing some of these evidence gaps. These would be important steps towards providing an evidence base for improving programme practice – which is currently built on shallow and fragmented foundations.

5.4 Tackling the challenges

The HEA commissioned this review with the intention of harnessing available evidence on programme impacts to inform its future decision-making. The review shows a growing and diverse evidence base, with some positive indications of impacts from programmes, but with (as yet) a fragmented evidence base to draw on to inform future policy and improvements. Much of this evidence is also drawn from outside the UK. While this is valuable to informing UK-based practice, it raises larger questions around the research funding, orientation and/or institutional capacity in all UK home countries to undertake systematic impact assessment of teaching development programmes.

These issues, and more specific suggestions for a future research focus, have been drawn together here in a series of interrelated recommendations for the attention of all agencies with an interest in how more effective teaching development has been (and can be) contributing to wider public policy, and institutional, goals for HE. We regard all of these proposals as evidence-based and important. We also indicate those actions we would regard as urgent priorities.

Recommendation 1 – Conduct a sector consultation on this review: This review has been carried out robustly using systematic (literature) review methods and consulting some of the key players as to any emerging evidence gaps. We commend it to the HEA as a state-of-the-art review of current knowledge. However, we are conscious that we have had very little feedback from calls for evidence from the sector, and recommend that it is 'stress tested' by the HEA through an open consultation seeking views on the report, and a call for further evidence. This is an urgent priority.

Recommendation 2 – Develop a cross-national review of wider evidence: Much of the evidence we have drawn on is from outside the UK. In parallel with Recommendation 1, we would suggest:

2a The HEA, or others, should follow up this review with selected cross-national and international bodies and fora, to seek their views on the report and to identify any synergies with possible future actions by those bodies.

2b There is a need for a focused study to review developing national evidence in selected countries where there are new or significant teacher development initiatives, or arrangements for networking experience in HE, to assess any lessons for how others have been, or are, developing arrangements for assessing the effectiveness and effects of these investments. We propose there is particular value in focusing this study on four to five selected countries in Europe. This study will provide a starting point for reviewing this focus, and the consultation proposed in Recommendation 1 and Recommendation 2a will also contribute to the selection of countries to be reviewed.

Recommendation 3 – Prioritise the development of an impact assessment guide/toolkit for HE: We are conscious that localised research, and also evaluations of institutional and partnership initiatives focused on teaching development programmes, will continue in the UK. These need to be urgently informed by the lessons from this research, and we propose that there is a need for the production of a generic impact assessment guide or toolkit that can draw on these lessons to:

• unpick some of the suggested methodological requirements and contexts to help programme managers/institutional funders to set evaluation frameworks;

• set out evidence-based pros and cons of approaches to aid local review;

• map context-specific practical resolutions to some of the methodological challenges identified here;

• provide guided links to tools and key papers.

This should be an urgent priority, but any such guide is likely to need to be updated to take account of outcomes from Recommendations 6 and 7.

Recommendation 4 – Establish a national focus for further research to improve impact evidence: The review has documented the strengths of available research but also many weaknesses. These would seem to be a potential focus for a small-scale and highly targeted, competitively allocated research-funding exercise conducted unilaterally by the HEA, or others, to inform better practice, or perhaps in collaboration with funding and/or research councils. Such an effort might explore different funding channels (competitive bids for research grants, targeted research studentships, knowledge exchange fora, and/or funded short-term research fellowships) with the joint aim of filling critical information gaps and helping to build UK research capacity in this area. Some of the issues for a themed focus for such funding are set out in Section 4.4 above, but others may emerge from the consultation proposed in Recommendation 1 above. To this might be added funding for a study or review of the relative impact effects of disciplinary-focused programmes – building on this review but looking to add primary evidence to fill this evident research gap.

Recommendation 5 – Establish a national focus for further research to improve policy formation: The HEA, or others, should seek to establish two fully funded research investments in the UK to extend the evidence base in critical areas for policy formation and institutional improvement, and specifically for establishing:

5a one or more cross-institutional, impact-centred longitudinal studies of programme participation and impact in the UK, to report over a 24-30 month period on medium-term effects for teachers and students, longer-term potential, impact determinants and the utility of shorter-term and longer-term impact measures, and to encourage funding bodies to establish the sustainability of those inquiries through tracking evidence to establish longer-term impacts for teachers and institutions;

5b one or more parallel studies on transfer mechanisms for programme investments, and in particular providing an empirical basis for the call by De Rijdt and colleagues (2012) for analysis of influencing variables and motivators. These may best prioritise tracking-back studies to capitalise on existing institutional programmes and to produce early evidence, and would also be expected to report within 18 months. There is value also in commissioning one or more tracking-forward studies, but these could not be expected to produce viable evidence in less than 24-30 months;

5c one or more research investments that develop and harness control or comparison group methods allied to existing teacher development investments in the UK, and that critically appraise the added value and additionality of teaching development programmes. These should be able to report within 18 months.

Each of these investments will take some time to set up and conclude, and we regard commissioning these as an urgent priority.

Recommendation 6 – Establish a study to map use of tools and evaluative approaches: This review has only partially completed its goal of critically appraising research methods and tools, due to the inaccessibility of much of this evidence from published research. We propose the HEA, or others, should establish a study to extend this analysis through a primary research-based review with selected researchers drawn from the bibliography in this study, to map and critically appraise the use, and utility, of tools, and the implications for further development of common tools and approaches in impact assessment. This should be an urgent priority, which might base its approach on the review of 37 studies internationally conducted by Stes and colleagues (Stes et al., 2012).

Recommendation 7 – Establish a national development project to produce 'generic' impact assessment tools and instruments: There is a need to provide methodological development funding for one or more studies centred on the production and testing of generic tools and customisable research instruments to support programme impact assessment. In particular, there is a need for the HEA, or others, to support:

7a a retrospective review of existing teacher-impact tools (ATI, MEQ, and others);

7b a retrospective review of existing student experience testing tools (e.g. SEEQ, ATI, MEQ);

7c a developmental study to produce and test a generic student experience assessment tool capable of customised use in different teaching development contexts in HE (e.g. Trigwell's proposal, 2012);

7d a developmental study to map and assess lessons in the use and application of alternative impact measures, including those drawn from Social Return on Investment (SRoI) and other social impact measures.

Each of these would be expected to report within 12 months. We regard each of them as an urgent priority.

Recommendation 8 – Establish a national repository of research-related impact evidence: The evidence we have drawn on is highly fragmented and often inaccessible. We propose that the HEA, or others, should review the scope, within copyright conventions, to establish an online repository to widen access to the material highlighted in this review, and to other material emerging from Recommendations 1 and 2 (and others) above.

Recommendation 9 – Establish a cross-sector 'benchmarking' study: The review presented here has necessarily focused on HE experiences of teacher development.

However, the utility of the evidence gathered, and of any future evidence produced from the investments suggested here, would be substantially enhanced by wider baseline evidence of the impact of teacher development programmes from outside HE. We propose there is much that can be drawn in the UK from school-based teacher improvement, and specifically from other post-secondary teacher development, including within further education. We propose that a single benchmarking study is set up by the HEA, or others, to identify and review such evidence and its implications (against this study) for HE, and that this study should report within 12 months. This is an urgent priority, to enable better understanding of the significance of the findings from this review.

While these recommendations are presented here to the HEA, as the architects of this review, we anticipate that action against them may require multi-agency inputs, and the HEA may not be best placed to lead the response on each.

We commend this review, and these recommendations, to the HEA, and thank the agency for its foresight in establishing what we hope others will also regard as an important and timely synthesis of the state of current knowledge.

Annex A: Research issues and questions

a The quality and robustness of existing evidence about the efficacy of teaching development programmes.

b The costs and benefits of including disciplinary components in teaching development programmes, and of focusing teaching development programmes around a disciplinary context.

c Good practice in the development of both national and sub-national but inter-institutional teaching development frameworks that are used to structure and design teaching development programmes.

d Examples where the design of teaching development programmes and frameworks has been based on conceptual schemes and models of teaching and learning.

e The impact on the efficacy of teaching development programmes of the compulsory or voluntary nature of those programmes.

f The different ways in which teaching development programmes can be, and have been, evaluated, and the relative merits of those different evaluation methods, looking in particular at methods of evaluating the impact of teaching development programmes on student learning.

g The impact of teaching development programmes on wider teaching cultures, e.g. the status of teaching, and teachers' involvement in the scholarship of teaching and learning.

h What obstacles and challenges exist to investigating the efficacy and impact of teaching development programmes.

i How the goals of teaching development programmes have been conceptualised, and how the balance has been struck between aiming at changes to teachers' attitudes and practices, and changes to students' attitudes and learning experiences.

Annex B: Bibliography

Akerlind, G.S. (2003) Growing and developing as a university teacher – Variation in meaning. Studies in Higher Education. 28(4), 375-390.

Baldwin, T.T. and Ford, J.K. (1988) Transfer of training: A review and directions for future research. Personnel Psychology. 41, 63-105.

Baume, D. (2006) Towards the end of the Last Non-Professions. International Journal for Academic Development. 11(1), 57-60.

Béchard, J.-P. (2001) L'enseignement supérieur et les innovations pédagogiques: une recension des écrits. Revue des sciences de l'éducation. 27(2), 257-281.

Biggs, J. (1987) Study Process Questionnaire Manual. Melbourne: Australian Council for Educational Research.

Blume, B.D., Ford, J.K., Baldwin, T.T. and Huang, J.L. (2010) Transfer of training: A meta-analytic review. Journal of Management. 36(4), 1065-1105.

Bologna Declaration (1999) Joint declaration of the European Ministers of Education: The European Higher Education Area. Available from: http://www.bologna-berlin2003.de/pdf/bologna_declaration.pdf [last accessed 12 February 2007].

Bowden, J.A. (1989) Curriculum Development for Conceptual Change Learning: A Phenomenographic Pedagogy. Paper presented at the Sixth Annual (International) Conference of the Hong Kong Educational Research Association, Hong Kong.

Brawner, C.E., Felder, R.M., Allen, R. and Brent, R. (2002) A survey of faculty teaching practices and involvement in faculty development activities. Journal of Engineering Education. 91(4), 393-396.

Brew, A. and Ginns, P. (2008) The relationship between engagement in the scholarship of teaching and learning and students' course experience. Assessment and Evaluation in Higher Education. 33(5), 535-545.

Burke, J.C. and Minassians, H. (2001) Linking state resources to campus results: From fad to trend. The fifth annual survey. Albany, NY: The Nelson A. Rockefeller Institute of Government.

Burke, L.A. and Hutchins, H.M. (2007) Training transfer: An integrative literature review. Human Resource Development Review. 6, 263-296.

Butcher, J. and Stoncel, D. (2012) The impact of a Postgraduate Certificate in Teaching in Higher Education on university lecturers appointed for their professional expertise at a teaching-led university: 'It's made me braver'. The International Journal for Academic Development. 17(2), 149-162.

Chalmers, D. (2007) A review of Australian and international quality systems and indicators of learning and teaching. Sydney, NSW: Carrick Institute for Learning and Teaching in Higher Education Ltd. Available from: http://www.catl.uwa.edu.au/projects/tqi [last accessed July 2012].

Chalmers, D. (2008) Indicators of university teaching and learning quality. Sydney, NSW: Australian Learning and Teaching Council. Available from: http://www.catl.uwa.edu.au/projects/tqi [last accessed July 2012].

Chalmers, D. (2010) National Teaching Quality Indicators Project - Final Report. Rewarding and recognising quality teaching in higher education through systematic implementation of indicators and metrics on teaching and teacher effectiveness. Sydney, NSW: Australian Learning and Teaching Council. Available from: http://www.catl.uwa.edu.au/projects/tqi [last accessed July 2012].

Chalmers, D., Lee, K. and Walker, B. (2008) International and national indicators and outcomes of quality teaching and learning currently in use. Sydney, NSW: Australian Department of Education, Employment and Workplace Relations. Available from: http://www.uwa.edu.au/__data/assets/pdf_file/0008/1891664/International_and_national_indicators_and_outcomes_of_quality_teaching_and_learning.pdf [last accessed July 2012].

Coffey, M. and Gibbs, G. (2000) The Evaluation of the Student Evaluation of Educational Quality Questionnaire (SEEQ) in UK Higher Education. Assessment and Evaluation in Higher Education. 26(1), 89-93.

D'Andrea, V.-M. and Gosling, D. (2005) Improving Teaching and Learning in Higher Education: a whole institution approach. London: McGraw Hill.

De Rijdt, C., Stes, A., van der Vleuten, C. and Dochy, F. (2012) Influencing variables and moderators of transfer of learning to the workplace within the area of staff development in higher education: Research review. Educational Research Review. Available from: http://dx.doi.org/10.1016/j.edurev.2012.05.007 [last accessed September 2012].

Department for Business, Innovation and Skills (2011) Higher Education. Students at the Heart of the System. Available from: http://www.bis.gov.uk/assets/biscore/higher-education/docs/h/11-944-higher-education-students-at-heart-of-system.pdf [last accessed August 2012].

Department for Business, Innovation and Skills (2012) Higher Education. Students at the Heart of the System. Government Response. Available from: http://www.bis.gov.uk/assets/biscore/higher-education/docs/g/12-890-government-response-students-and-regulatory-framework-higher-education [last accessed August 2012].

Department for Education and Skills (2003) The Future of Higher Education. Available from: http://www.bis.gov.uk/assets/biscore/corporate/migratedd/publications/f/future_of_he.pdf [last accessed June 2012].

Desimone, L.M. (2009) Improving Impact Studies of Teachers' Professional Development: Toward Better Conceptualizations and Measures. Educational Researcher. 38(3), 181-199.

Dixon, K. and Scott, S. (2003) The evaluation of an off-shore professional-development programme as part of the university's strategic plan: A case study approach. Quality in Higher Education. 25(2), 63-80.

Donnelly, R. (2007) Perceived Impact of Peer Observation of Teaching in Higher Education. International Journal of Teaching and Learning in Higher Education. 19(2), 117-129.

Eble, K.E. and McKeachie, W.J. (1985) Improving Undergraduate Education Through Faculty Development. San Francisco: Jossey-Bass.

Entwistle, N.J. and Tait, H. (1994) The Revised Approaches to Studying Inventory. Centre for Research into Learning and Instruction, University of Edinburgh.

Fitzpatrick, R. (2001) The strange case of the transfer of training estimate. Industrial-Organizational Psychologist. 39(2), 18-19.

Flecknoe, M. (2002) Measuring the Impact of Teacher Professional Development: Can It Be Done? European Journal of Teacher Education. 25(2-3), 119-134.

Gaff, J. (1975) Towards Faculty Renewal. San Francisco: Jossey-Bass.

Gaff, J.G. and Simpson, R.D. (1994) Faculty development in the United States. Innovative Higher Education. 18(3), 167-176.

Getzel, E.E., Briel, L.W. and McManus, S. (2003) Strategies for Implementing Professional Development Activities on College Campuses: Findings from the OPE-Funded Project Sites (1999-2002). Journal of Postsecondary Education and Disability. 17(1), 59-76.

Gibbs, G. (1995a) Changing lecturers' conceptions of teaching and learning through action research. In: Brew, A. (ed.) Directions in Staff Development. Buckingham: SRHE and Open University Press.

Gibbs, G. (1995b) How can promoting excellent teachers promote excellent teaching? Innovations in Education and Training International. 32(1), 74-84.

Gibbs, G. (2010) Dimensions of quality. York: Higher Education Academy. Available from: http://www.heacademy.ac.uk/assets/EvidenceNet/Summaries/doq_summary.pdf [last accessed October 2012].

Gibbs, G. (2012) Reflections on the changing nature of educational development. In press.

Gibbs, G. and Coffey, M. (2004) The impact of training of university teachers on their teaching skills, their approach to teaching and the approach to learning of their students. Active Learning in Higher Education. 5(1), 87-100.

Gibbs, G., Habeshaw, T. and Yorke, M. (2000) Institutional Learning and Teaching Strategies in English Higher Education. Higher Education. 40(3), 351-372.

Gilbert, A. and Gibbs, G. (1999) A proposal for an international collaborative research programme to identify the impact of initial training on university teachers. Research and Development in Higher Education. 21, 131-143.

Gow, L. and Kember, D. (1993) Conceptions of teaching and their relationship to student learning. British Journal of Educational Psychology. 63(1), 20-33.

Guskey, T.R. (1986) Staff development and the process of teacher change. Educational Researcher. (May), 5-12.

Guskey, T.R. (2000) Evaluating Professional Development. Thousand Oaks, CA: Corwin Press.

Haigh, N. (2012) Sustaining and spreading the positive outcomes of SoTL projects: issues, insights and strategies. The International Journal for Academic Development. 17(1), 19-31.

Hanbury, A., Prosser, M. and Rickinson, M. (2008) The differential impact of UK accredited teaching development programmes on academics' approaches to teaching. Studies in Higher Education. 33(4), 449-483.

Hardy, I. (2008) The Impact of Policy upon Practice: An Australian Study of Teachers' Professional Development. Teacher Development. 12(2), 103-114.

Healey, M. (2000) Developing the scholarship of teaching in higher education: A discipline-based approach. Higher Education Research and Development. 19(2), 169-189.

HEA/QAA (2009) Quality Enhancement and Assurance in Wales: A Changing Picture. York: Higher Education Academy and Quality Assurance Agency.

HEFCE (2001) Strategies for Learning and Teaching in Higher Education. Bristol: HEFCE. Circular 01/37.

HEFCE (2003) HEFCE strategic plan 2003-08. Available from: http://webarchive.nationalarchives.gov.uk/20100202100434/http:/www.hefce.ac.uk/pubs/hefce/2003/03_35.htm [last accessed June 2012].

Hewson, M.G., Copeland, H.L. and Fishleder, A.J. (2001) What's the use of faculty development: Programme evaluation using retrospective self-assessments and independent performance ratings. Teaching and Learning in Medicine. 13(3), 153-160.

Heutte, J., Lameul, G. and Bertrand, C. (2010) Dispositifs de formation et d'accompagnement des enseignants du supérieur; point de situation et perspectives françaises concernant le développement de la pédagogie universitaire numérique. Paper presented on behalf of the Ministère de l'enseignement supérieur et de la recherche at TICE 2010 – 7ème Colloque Technologies de l'Information et de la Communication pour l'Enseignement, University of Lorraine, 6-8 December.

Ho, A., Watkins, D. and Kelly, M. (2001) The Conceptual Change Approach to Improving Teaching and Learning: An Evaluation of a Hong Kong Staff Development Programme. Higher Education. 42(2), 143-169.

Holton, E.F. III and Baldwin, T.T. (2000) Making transfer happen: An action perspective on learning transfer systems. Advances in Developing Human Resources. 2(4), 1-6.

Howland, J. and Wedman, J. (2004) A process model for faculty development: individualizing technology learning. Journal of Technology and Teacher Education. 12(2), 239-263.

Hoy, A.W. (2008) Teachers' Sense of Efficacy Scale. Columbus, OH: The College of Education and Human Ecology, The Ohio State University. Available from: http://people.ehe.osu.edu/ahoy/research/instruments/#Sense [last accessed 14 August 2012].

Jarvinen, A. and Kohonen, V. (1995) Promoting professional development in higher education through portfolio assessment. Assessment and Evaluation in Higher Education. 20(1), 25-36.

Kahn, J. and Pred, R. (2002) Evaluation of a faculty development model for technology use in higher education for late adopters. Computers in the Schools. 18(4), 127-150.

Kane, R., Sandretto, S. and Heath, C. (2002) Telling Half the Story: A Critical Review of Research on the Teaching Beliefs and Practices of University Academics. Review of Educational Research. 72(2), 177-228.

Kirkpatrick, D.L. (1998) Evaluating Training Programmes. Second ed. San Francisco: Berrett-Koehler.

Kreber, C. and Brook, P. (2001) Impact Evaluation of Educational Development Programmes. International Journal for Academic Development. 6(2), 96-107.

Kupritz, V.W. (2002) The relative impact of workplace design on training transfer. Human Resource Development Quarterly. 13(4), 427-447.

Levinson-Rose, J. and Menges, R.J. (1981) Improving college teaching: A critical review of research. Review of Educational Research. 51, 403-434.

Lindblom-Ylänne, S. and Hämäläinen, K. (2004) The Bologna Declaration as a Tool to Enhance Learning and Instruction at the University of Helsinki. International Journal for Academic Development. 9(2), 153-165.

Lindblom-Ylänne, S., Trigwell, K., Nevgi, A. and Ashwin, P. (2006) How approaches to teaching are affected by discipline and teaching context. Studies in Higher Education. 31(3), 285-298.

Lueddeke, G. (2003) Professionalising teaching practice in higher education: a study of disciplinary variation and 'teaching-scholarship'. Studies in Higher Education. 28(2), 213-228.

McAlpine, L. (2003) Het belang van onderwijskundige vorming voor studentgecentreerd onderwijs: de praktijk geëvalueerd (The importance of instructional development for student-centred teaching: An examination of practice). In: Druine, N., Clement, M. and Waeytens, K. (eds.) Dynamiek in het hoger onderwijs. Uitdagingen voor onderwijsondersteuning (Dynamics in higher education: Challenges for teaching support). Leuven: Universitaire Pers, pp. 57-71.

McArthur, J., Earl, S. and Edwards, V. (2004) Impact of Courses for University Teachers. Paper presented to the Australian Association for Research in Higher Education, Melbourne, Australia.

McNay, I. (1995) From the collegial academy to the corporate enterprise: the changing culture of universities. In: Schuller, T. (ed.) The Changing University? Buckingham: SRHE and Open University Press.

Mahoney, C. (2012) Professor Craig Mahoney on learning and teaching for higher education today. The Guardian, 26 July. Available from: http://www.guardian.co.uk/higher-education-hea-partner-zone/learning-and-teaching-craig-mahoney?newsfeed=true [last accessed August 2012].

Marsh, H.W. (1982) SEEQ: A Reliable, Valid and Useful Instrument for Collecting Students' Evaluations of University Teaching. British Journal of Educational Psychology. 52(1), 77-95.

Marshall, B. (2004) Learning from the Academy: From peer observation of teaching to peer enhancement of learning and teaching. The Journal of Adult Theological Education. 1(2), 185-204.

Marton, F. and Säljö, R. (1976) On qualitative differences in learning: I. Outcome and process. British Journal of Educational Psychology. 46(1), 4-11.

Ménard, L., Legault, F., Ben Rhouma, T., Dion, J.-S. and Meunier, H. (2011) La Formation à l'Enseignement au Postsecondaire: Mesurer ses effets sur les enseignants et les étudiants. In: Actes du VIième colloque 'Questions de pédagogies dans l'enseignement supérieur. Les courants de la professionnalisation: enjeux, attentes, changements'. 2 vols. Angers: ISSBA, pp. 195-206.

Nasmith, L., Saroyan, A., Steinert, Y., Lawn, N. and Franco, E.D. (1995) Long-term Impact of Faculty Development Workshops. Montreal, QC: McGill University.

Nasr, A., Gillett, M. and Booth, E. (1996) Lecturers' teaching qualifications and their teaching performance. Research and Development in Higher Education. 18, 576-581.

Orr, K. (2008) Room for Improvement? The Impact of Compulsory Professional Development for Teachers in England's Further Education Sector. Journal of In-Service Education. 34(1), 97-108.

Parsons, D.J., Hughes, J. and Walsh, K. (2010) Initial Training and Professional Development of Teachers and Trainers beyond Upper-secondary Education. Publications Office of the European Union, Cedefop.

Parsons, D.J., Bloomfield, A., Burkey, S. and Holland, J. (2011) Impact Evaluation of the UK Teaching and Learning Research Programme. Swindon: Economic and Social Research Council.

Pawson, R. and Tilley, N. (1997) Realistic Evaluation. London: Sage.

Persellin, D. and Goodrick, T. (2010) Faculty development in higher education: Long-term impact of a summer teaching and learning workshop. Journal of the Scholarship of Teaching and Learning. 10(1), 1-13.

Pintrich, P.R., Smith, D.A.F. and McKeachie, W.J. (1989) A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: National Center for Research to Improve Postsecondary Teaching and Learning, University of Michigan.

Postareff, L. (2007) Teaching in Higher Education: From Content-focused to Learning-focused Approaches to Teaching. Research Report 214. Helsinki: University of Helsinki, Department of Education.

Postareff, L., Lindblom-Ylänne, S. and Nevgi, A. (2007) The effect of pedagogical training on teaching in higher education. Teaching and Teacher Education. 23(5), 557-571.

Postareff, L., Lindblom-Ylänne, S. and Nevgi, A. (2008) A follow-up study of the effect of pedagogical training on teaching in higher education. Higher Education: The International Journal of Higher Education and Educational Planning. 56(1), 29-43.

Prebble, T., Hargreaves, H., Leach, L., Naidoo, K., Suddaby, G. and Zepke, N. (2004) Impact of Student Support Services and Academic Development Programmes on Student Outcomes in Undergraduate Tertiary Study: A synthesis of the research. Ministry of Education, New Zealand.

Prosser, M. and Millar, R. (1989) The how and what of learning physics. The European Journal of Psychology of Education. 4, 513-528.

Prosser, M. and Trigwell, K. (1998) Understanding Learning and Teaching: The Experience in Higher Education. Milton Keynes: Open University Press.

Prosser, M., Rickinson, M., Bence, V., Hanbury, A. and Kulej, M. (2006) Formative evaluation of accredited programmes. Higher Education Academy. Available from: http://www.heacademy.ac.uk/resources/detail/publications/formative_evaluation_of_accredited_programmes_may06

Ramsden, P. (1991) A Performance Indicator of Teaching Quality in Higher Education: The Course Experience Questionnaire. Studies in Higher Education. 16(2), 129-150.

Ramsden, P. (1992) Learning to Teach in Higher Education. London: Routledge.

Roche, L.A. and Marsh, H.W. (2000) Multiple dimensions of university teacher self-concept. Instructional Science. 28(5-6), 439-468.

Romano, J.L., Hoesing, R., O'Donovan, K. and Weinsheimer, J. (2004) Faculty at Mid-Career: A Program to Enhance Teaching and Learning. Innovative Higher Education. 29(1), 21-48.

Roxå, T. and Mårtensson, K. (2009) Significant conversations and significant networks – exploring the backstage of the teaching arena. Studies in Higher Education. 34(5), 547-559.

Roxå, T., Mårtensson, K. and Alveteg, M. (2011) Understanding and influencing teaching and learning cultures at university: a network approach. Higher Education. 62(1), 99-111.

Rust, C. (1998) The impact of educational development workshops on teachers' practice. The International Journal for Academic Development. 3(1), 72-80.

Schreurs, M.L. and Van Vliet, J. (1998) The improvement of student skills in a problem-based curriculum. In: Rust, C. (ed.) Improving Student Learning: Improving Students as Learners. Oxford: Oxford Centre for Staff and Learning Development, pp. 154-159.

The Scottish Government (2012) Putting Learners at the Centre: Delivering our Ambitions for Post-16 Education. Edinburgh: Scottish Government. Available from: http://www.scotland.gov.uk/Resource/Doc/357943/0120971.pdf [last accessed August 2012].

Sowers, J.-A. and Smith, M.R. (2003) A Field Test of the Impact of an Inservice Training Program on Health Sciences Education Faculty. Journal of Postsecondary Education and Disability. 17(1), 33-48.

Steinert, Y., Mann, K., Centeno, A., Dolmans, D., Spencer, J., Gelula, M. and Prideaux, D. (2006) A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Medical Teacher. 28(6), 497-526.

Stes, A., Clement, M. and Van Petegem, P. (2007) The Effectiveness of a Faculty Training Programme: Long-term and institutional impact. International Journal for Academic Development. 12(2), 99-109.

Stes, A., Min-Leliveld, M., Gijbels, D. and Van Petegem, P. (2010a) The impact of instructional development in higher education: The state-of-the-art of the research. Educational Research Review. 5(1), 25-49.

Stes, A., Coertjens, L. and Van Petegem, P. (2010b) Instructional development for teachers in higher education: impact on teaching approach. Higher Education. 60(2), 187-204.

Stes, A., De Maeyer, S., Gijbels, D. and Van Petegem, P. (2012) Instructional development for teachers in higher education: Effects on students' perceptions of the teaching–learning environment. British Journal of Educational Psychology. 82(3), 318-419.

Thépaut, A. and Lemaitre, D. (2011) Historique du colloque. Available from: http://www.colloque-pedagogie.org/workspaces/presentation/historique [last accessed 10 October 2012].

Trigwell, K. (1995) Increasing faculty understanding of teaching. In: Wright, W.A. (ed.) Teaching Improvement Practices: Successful Faculty Development Strategies. New York: Anker.

Trigwell, K. (2012) Evaluating the impact of university teaching development schemes. In: Simon, E. and Pleshova, G. (eds.) Teacher Development in Higher Education. London: Routledge, pp. 257-273.

Trigwell, K. and Prosser, M. (1991) Improving the quality of student learning: the influence of learning context and student approaches to learning on learning outcomes. Higher Education (Special Edition on Student Learning). 22(3), 251-266.

Trigwell, K. and Prosser, M. (1996) Congruence between intention and strategy in university science teachers' approaches to teaching. Higher Education. 32(1), 77-87.

Trigwell, K. and Prosser, M. (2004) Development and Use of the Approaches to Teaching Inventory. Educational Psychology Review. 16(4), 409-424.

Trigwell, K., Prosser, M. and Waterhouse, F. (1999) Relations between teachers' approaches to teaching and students' approaches to learning. Higher Education. 37(1), 57-70.

Trigwell, K., Martin, E., Benjamin, J. and Prosser, M. (2000) Scholarship of Teaching: A model. Higher Education Research & Development. 19(2), 155-168.

Trowler, P. and Bamber, R. (2005) Compulsory Higher Education Teacher Training: Joined-up policies, institutional architectures and enhancement cultures. International Journal for Academic Development. 10(2), 79-93.

Van Note Chism, N. and Szabó, B. (1997) How faculty development programs evaluate their services. Journal of Faculty, Programme and Organisational Development. 15(2), 55-62.

Van Rossum, E.J. and Schenk, S.M. (1984) The relationship between learning conception, study strategy and learning outcome. British Journal of Educational Psychology. 54(1), 73-83.

Weimer, M. (1990) Improving College Teaching. San Francisco: Jossey-Bass.

Weimer, M. and Lenze, L.F. (1998) Instructional interventions: A review of the literature on efforts to improve instruction. In: Perry, R. and Smart, J. (eds.) Effective teaching in higher education. New York: Agathon Press, pp. 205-240.

Contact us

The Higher Education Academy, Innovation Way, York Science Park, Heslington, York YO10 5BR

+44 (0)1904 717500 enquiries@heacademy.ac.uk

ISBN: 978-1-907207-61-7

© The Higher Education Academy, 2012

The Higher Education Academy (HEA) is a national body for learning and teaching in higher education. We work with universities and other higher education providers to help bring about change in learning and teaching. We do this to improve the experience that students have while they are studying, and to support and develop those who teach them.

Our activities focus on rewarding and recognising excellence in teaching, bringing together people and resources to research and share best practice, and helping to influence, shape and implement policy - locally, nationally, and internationally.

www.heacademy.ac.uk | Twitter @HEAcademy

No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any storage and retrieval system without the written permission of the Editor. Such permission will normally be granted for educational purposes provided that due acknowledgement is given.

To request copies of this report in large print or in a different format, please contact the communications office at the Higher Education Academy: 01904 717500 or pressoffice@heacademy.ac.uk