Student Engagement Survey Middle East (SESME) Implementation Workshop


Smarter Learning Symposia, Shangri-La Hotel, Dubai

Wednesday 24 April 2013

Professor Hamish Coates & Dr Sarah Richardson (highereducation@acer.edu.au)

Linking global insights into higher learning

Educational principles

Academic standards

Innovation

Evidence

Productivity

Value added

Participation

Outcomes

Work readiness

Curriculum

Capabilities

Student focused

Focused support

Pathways

Career-ready

Multi-disciplinary

Global education

Human potential

Imagine an institution…

that identifies key indicators of quality – the things that really count

sets externally referenced and context-relevant standards of performance

collects quantitative data on performance

uses that data to highlight areas of strength and improve areas of weakness

provides information to potential students in an inspiring fashion

assures the public that minimum standards of performance are being met

Plan

Act

Evaluate

Improve

Hunch

institutional, educational, epistemological, conceptual, disciplinary, industrial, transnational, commercial, cultural, professional, practical, supranational, universal, ontological, metaphysical, pedagogical, situational, organisational, interpersonal, historical, aesthetic, political…

[Diagram: 'Little data', 'Happiness data' and 'Effectiveness data' plotted against commitment to effectiveness across elite, mass and universal systems]

Spotting areas of risk

Producing a cultivating climate

Profiling groups

Responding to individuality

Identifying unexpectedness

Stimulating change

Tracking change

Change perspectives

An emerging global measure

Global benchmarking: student engagement measured systematically in United States, Canada, Australia, New Zealand, Ireland, United Kingdom, China, Japan, South Africa, etc.

Nearly 4 million students participated in the National Survey of Student Engagement (NSSE)

Over 1,500 North American colleges and universities participated in NSSE

More than 150,000 students have participated in the Australasian Survey of Student Engagement (AUSSE)

After lunch…

Refine definition of ‘student engagement’ and how we will measure this in the Middle East

Look at reporting for institutions

Review approach to pilot implementation in 2013-14

Chart next steps

Continuing consultative development

Initial meetings in Qatar in November 2011 and October 2012

Constitute Advisory Group to oversee development

Establish communications architectures

Students, staff, institutions and other stakeholders will provide feedback

Invite institutions to participate in the pilot survey fieldwork

What is student engagement?

Student-centred perspective of learning

Encompasses both academic and ‘beyond classroom’ activities and conditions

Assumption that individuals learn and develop through involvement with key educational practices and activities

Grounded in decades of research

Links between student engagement and retention, completion and positive learning outcomes

Developing the questionnaire for the pilot

MESEQ will be based on the current version of NSSE

MESEQ will be adapted to be appropriate for the Middle East context

Feedback will be sought from the sector and Advisory Group to further develop the MESEQ

Survey will be discussed with students in focus groups, interviews and small scale tests

Based on feedback and findings from consultation and focus groups, MESEQ will be further revised

Shaping aspirations

Admission and integration

Involvement and retention

Graduate transitions

Higher-Order Thinking

General Learning

General Development

Career Readiness

Average Overall Grade

Departure Intention

Overall Satisfaction

Future Intentions

Academic Challenge

Active Learning

Student and Staff Interactions

Enriching Educational Experiences

Supportive Learning Environment

Work Integrated Learning
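The scoring rules for the MESEQ are still to be finalised, but as a rough illustration only: engagement scales of this kind (NSSE/AUSSE-style) are typically reported as the mean of a student's recoded item responses rescaled to a 0–100 metric. The sketch below uses hypothetical item names and a hypothetical 0–3 coding; it is not ACER's actual scoring procedure.

```python
import pandas as pd

# Hypothetical Active Learning items on a 4-point frequency scale
# (0 = Never, 1 = Sometimes, 2 = Often, 3 = Very often).
ACTIVE_LEARNING_ITEMS = ["asked_questions", "made_presentation",
                         "worked_with_students", "tutored_others"]

def scale_score(responses: pd.DataFrame, items: list[str], max_value: int = 3) -> pd.Series:
    """Mean of each student's item responses, rescaled to 0-100.

    Assumes items are already coded 0..max_value; missing items are
    ignored in the mean (one common convention, not necessarily ACER's).
    """
    return responses[items].mean(axis=1, skipna=True) * (100 / max_value)

# Example: three students' (made-up) responses
df = pd.DataFrame({
    "asked_questions":      [3, 1, 0],
    "made_presentation":    [2, 0, 1],
    "worked_with_students": [3, 2, 1],
    "tutored_others":       [0, 0, 2],
})
df["active_learning"] = scale_score(df, ACTIVE_LEARNING_ITEMS)
print(df["active_learning"])  # approx. 66.7, 25.0, 33.3
```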

Academic Challenge

Extent to which expectations and assessments challenge students to learn

Time spent preparing for class

Amount of reading and writing

Institutional expectations

Focus of coursework

Academic Challenge

Hours spent preparing for class

[Chart: per cent response by hours per week spent preparing for class (None, 1 to 5, 6 to 10, 11 to 15, 16 to 20, 21 to 25, 26 to 30, Over 30), first year vs later year students]

Active Learning

Students’ efforts to actively construct their knowledge

Asking questions/contributing to discussions

Giving presentations

Working with other students

Participating in community-based learning

Active Learning

Frequently participated in active learning

[Chart: per cent response for each activity – tutored other students, community-based project, made presentation, worked with students during class, worked with students outside class, discussed ideas from classes, asked questions]

Student and Staff Interactions

Level and nature of students’ contact with teaching staff

Receiving feedback

Discussing grades and assignments

Discussing ideas from classes

Discussing career plans

Working with teaching staff

Student and Staff Interactions

‘Never’ interacted with staff

[Chart: per cent of students who 'never' engaged in each form of interaction – received feedback, discussed grades, discussed ideas from classes, talked about career plans, worked on other activities with staff, worked on a research project with staff]

Enriching Educational Experiences

Participation in broadening educational activities

Interacting with people of different backgrounds

Participating in extracurricular activities

Taking part in a practicum or internship

Doing volunteer work

Studying a foreign language

Participating in a learning community

Enriching Educational Experiences

Participated in broadening activities

[Chart: per cent response for each activity (practicum or internship, volunteer work, learning community or study group, study of a foreign language, study abroad program, independent study, culminating final-year program), first year vs later year students]

Supportive Learning Environment

Students’ feelings of support within the university community

Quality of relationships with students and staff

Academic support provision

Non-academic support provision

Support to socialise

Supportive Learning Environment

Quality of relationships

[Chart: per cent responses on a seven-point scale from Poor to Excellent for quality of relationships with other students, teaching staff, and administrative personnel and services]

Demographics and contexts

Demographic and context questions

Students' gender

Year level

Mode and type of study

Discipline

Residency/citizenship status

Home language

Ethnicity

Living arrangements

Age

Demographics and contexts

Average engagement by mode of study

[Chart: average scale scores on Academic Challenge, Active Learning, Student and Staff Interactions, Enriching Educational Experiences and Supportive Learning Environment, campus-based vs external/distance students]

Demographics and contexts

Average engagement by field of study

[Chart: average Active Learning scale score by field of study – agriculture, humanities, IT, science, business, engineering, health, education, creative arts, architecture]

Localisation for equivalency

Source → Translation → National review → Verification → National review → Final check

Shaping continuing improvement…?

Your reactions to the current draft instrument?

Further improvements you recommend?

Language and cultural issues?

Value of different questions?

How is data on engagement used?

Responding to accountability and transparency

Assessing and improving student learning, curriculum and teaching practices

Creating partnerships through benchmarking

Improving student retention and success

Institutional audits and performance reviews

Evidence-based conversations about improving student engagement

Do more for learning

Benchmarking results

Within an individual institution

Between individual institutions

Between groups of institutions

With national level findings
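How the benchmark reports are actually produced is ACER's work; purely as an illustration of the comparison involved, the sketch below assumes a flat file of per-student scale scores with hypothetical column names and contrasts each institution's average Active Learning score with the overall pilot average.

```python
import pandas as pd

# Hypothetical per-student records: institution code plus a 0-100 scale score.
students = pd.DataFrame({
    "institution":     ["A", "A", "B", "B", "B", "C", "C"],
    "active_learning": [42.0, 48.0, 35.0, 40.0, 38.0, 51.0, 47.0],
})

# Institution averages (one row per institution).
institution_means = (students
                     .groupby("institution")["active_learning"]
                     .mean()
                     .rename("institution_mean"))

# Difference from the overall (pilot-wide) average: the simplest form of
# comparison between institutions and with system-level findings.
overall_mean = students["active_learning"].mean()
benchmark = institution_means.to_frame()
benchmark["difference_from_overall"] = benchmark["institution_mean"] - overall_mean

print(benchmark.round(1))
```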

Benchmarking between institutions

Average Active Learning scale scores

[Chart: average Active Learning scale scores for 37 individual institutions]

Benchmarking between countries

Average scale scores

[Chart: average scale scores on Academic Challenge, Active Learning, Student and Staff Interactions, Enriching Educational Experiences, Supportive Learning Environment and Work Integrated Learning for Australian, New Zealand, South African, and USA and Canadian university bachelor students]

Pilot fieldwork

Following finalisation of the MESEQ, the survey will be piloted with participating institutions

Pilot survey will be conducted in English and Arabic

Survey will be conducted primarily online with some paper survey forms used as required

All data returned to ACER for processing and reporting

Psychometric analysis and reporting

Following fieldwork, ACER prepares overall data file

ACER conducts psychometric analysis of the survey and its scales

Each participating institution will receive its own data file and customised benchmark report

Overall report will be prepared and published
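The psychometric analysis itself is ACER's to specify; as one minimal, illustrative check of the kind typically run on such scales, internal-consistency reliability (Cronbach's alpha) for a scale's items could be estimated along the lines below. Item names and data are hypothetical, and this is only one small part of a full analysis.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal-consistency reliability for a set of scale items.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    Rows are students, columns are items; listwise deletion of missing data.
    """
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses to four Academic Challenge items (0-3 coding).
responses = pd.DataFrame({
    "item1": [3, 2, 1, 0, 2, 3],
    "item2": [2, 2, 1, 1, 2, 3],
    "item3": [3, 1, 0, 0, 2, 2],
    "item4": [2, 3, 1, 0, 1, 3],
})
print(round(cronbach_alpha(responses), 2))
```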

Review of SESME pilot

Based on the psychometric analyses, some changes may be made to the instrument

Based on feedback from participating institutions, survey methods will be refined

All participating institutions will be asked to provide feedback on their experience of being involved in the pilot

All feedback and analyses will feed into the development of the next SESME

SESME pilot timeline (2013–2014)

[Timeline, April 2013 to February 2014: Smarter Learning Symposia (24 April), consultation on instrument, focus groups, finalise instrument, language adaptation and translation, pre-pilot preparations, pilot fieldwork (students participate in survey), data file and report preparation, review of SESME pilot]