

Predictability in the Irish Leaving Certificate Examination

Working Paper 3: Student Questionnaire

Daniel Caro and Therese Hopfenbeck

This research was sponsored by the State Examinations Commission (SEC) of Ireland. Ruairí

Quinn, Minister for Education and Skills in Ireland, announced this project and his commitment

to tackle any problematic predictability in the Leaving Certificate examinations.1

Contents

Introduction
Survey and data
   Survey development
      Learning strategy items
      Views on predictability items
      Piloting the instrument and quality checking
   Survey versions
      Paper-and-pencil survey
      Online survey
   Sample
   Data management
Scale development
   Methodology
      Statistical models
      Number of factors in EFA
   Results
      Learning strategies
      Views on predictability
      Learning support
      Family SES
Analysis of research questions
   Research question 3 – how predictable are examination questions in the Leaving Certificate in Ireland?
   Research question 4 – which aspects of this predictability are helpful and which engender unwanted approaches to learning?
   Research question 7 – what kinds of examination preparation strategies do students use?
      Learning strategies
      Learning support
Regression analysis
   Examination scores model
   Predictability model
References
Appendix A: Questionnaire
Appendix B: Summary Tables
   Views on predictability
      Overall results
      Views on the exam by gender
      Views on the exam and family SES
      Views on the exam and exam results
   Learning strategies
      Overall results
      Learning strategies by gender
      Learning strategies and family SES
      Learning strategies and exam results
   Learning support
      Overall results
      Learning support by gender
      Learning support and exam results

1 Department of Education and Skills (2013) Supporting a better transition from second level to Higher Education: Key directions and next steps. 27 March. (http://www.education.ie/en/The-Department/Re-use-of-Public-Sector-Information/Library/Announcements/Supporting-a-Better-Transition-from-Second-Level-to-Higher-Education.html)


Introduction

This working paper is part of a broader investigation into the predictability of the Irish Leaving

Certificate (LC) examination. This research was sponsored by the State Examinations Commission

(SEC) of Ireland, as part of the Department of Education and Skills (DES) (2013) policy, Supporting a

better transition from second level to higher education: key directions and next steps. Overall, the

research involved:

1. A review of the international research literature

2. Analysis of the media coverage of the Leaving Certificate examinations in 2012 and 2013

3. Empirical work on the examinations materials from 2003 to 2012

4. A survey of 1,002 students’ views

5. Interviews with 70 teachers and 13 group interviews with students

This working paper is concerned with item 4. It provides a technical guide for understanding the

student survey, the derived dataset, and reports results of the analysis of the research questions

posed by the project that can be addressed with the questionnaire dataset. The technical guide

explains the methodological procedures involved in the development and administration of the

student questionnaire, the creation of the dataset, the analytic sample, and the development of

scales reflecting predictability views, learning strategies, learning support, and family socio-

economic status of students. The analysis of the research questions draws on the questionnaire

dataset, derived scales, and the examination scores2 of students. The following research questions

are analysed:

• Research question 3 – how predictable are examination questions in the Leaving Certificate

in Ireland?

• Research question 4 – which aspects of this predictability are helpful and which engender

unwanted approaches to learning?

• Research question 7 – what kinds of examination preparation strategies do students use?

The paper is organised as follows. The first section explains the survey development, the survey

versions, the sample, and the data preparation. The second section describes the techniques and

results of the scale development. The third section reports main results of the analysis of the

research questions. Finally, the fourth section presents additional results of regression models of

the examination scores and the predictability scales. Appendix A presents the student

questionnaire. Appendix B reports more detailed results related to the research questions (eg

results by gender).

Survey and data

The questionnaire was developed based upon previous research instruments and a literature

review on predictability. All the items have been adapted for the Irish Leaving Certificate and the

Irish context.

2 The SEC provided data on grade levels, which was transformed into points. These are referred to as examination scores in this

working paper.


Survey development

The survey on the Leaving Certificate consisted of six sections. Section A asked for background information such as gender, plans for the future, and language spoken at home, while section B asked for background information about parents' education, work and home possessions, in order to measure family cultural and socio-economic status (SES). Section D asked students to report their use of subject-specific learning strategies when they were preparing for the Leaving Certificate. Sections A, B and D used adapted items from the Programme for International Student Assessment (PISA), while sections C, E and F included newly developed items for this study. Section C asked students to indicate which subjects they were sitting for the Leaving Certificate, and at which level (higher or ordinary). Section E asked students to report their experience and views of the exam. Finally, section F asked students to answer questions about learning support for the exam, such as use of grind schools and family support.

Learning strategy items

Most items and scales for the learning strategies have been taken from already well-researched

instruments such as the student approaches questionnaire used in PISA (Marsh, Hau, Artelt &

Baumert, 2006). The PISA instrument measures two separate categories, cognitive strategies and

metacognitive strategies, and the items have been based upon Weinstein and Meyer’s taxonomy

(1986), Learning and Study Strategies Inventory – High School Version (LASSI-HS) (Weinstein,

Zimmerman & Palmer 1988; Weinstein & Palmer, 1990), which is one of the most widely used

learning strategy questionnaires in the world, and the Motivated Strategies for Learning Questionnaire (MSLQ) (Pintrich, Smith, Garcia & McKeachie, 1991). The limited validity of this

questionnaire for measuring learning strategies at a global level is well-known (Allan, 1997) and the

PISA instrument has been criticised for generalising students’ strategy use across a number of

subjects and contexts (Samuelstuen & Braten, 2007). We therefore asked students to rate their use

of learning strategies specifically in relation to three subjects: biology, English and geography, using

a four-point Likert scale from (1) almost never, (2) now and then, (3) often, to (4) always (see

question 11, Appendix A). We included three categories of learning strategies. The first one,

memorisation strategies, such as ‘I tried to learn my notes by heart’, is particularly useful for simple

tasks, and involves repeating the material, reciting and copying the material (Pintrich, Smith, Garcia

& McKeachie, 1991). The second category, elaboration strategies, such as the item ‘I tried to relate

new information to knowledge from other subjects’, involves making meaningful connections to

the learner's prior knowledge, while the last category, control strategies, such as the item 'I checked

if I had understood what I had read’, involves being able to monitor your own learning and adapt

and adjust strategies if needed (Weinstein et al, 2000; Weinstein & Meyer, 1991).
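Before any factor modelling, responses on such a four-point scale can be summarised into the three strategy categories by simple averaging. The sketch below is a minimal, hypothetical illustration on fabricated data; the item grouping and values are invented, not the paper's actual items.

```python
import numpy as np

# Hypothetical responses from three students to six strategy items, coded on
# the survey's four-point scale: 1 = almost never ... 4 = always.
# Illustrative grouping: columns 0-1 memorisation, 2-3 elaboration, 4-5 control.
responses = np.array([
    [4, 3, 2, 1, 3, 4],
    [2, 2, 4, 4, 3, 3],
    [1, 2, 3, 3, 4, 4],
])

# Simple (non-IRT) scale scores: the mean response across each category's items.
memorisation = responses[:, 0:2].mean(axis=1)
elaboration = responses[:, 2:4].mean(axis=1)
control = responses[:, 4:6].mean(axis=1)

print(memorisation, elaboration, control)
```

Mean scores of this kind are a common first summary; the scale development described later replaces them with factor-analytic and IRT-based scores.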

Views on predictability items

A number of items were developed to reflect the experience of students in taking the exam,

including their views on the predictability of the exam. Students were asked a total of ten

questions relating to the English, biology and geography exams (see question 12, Appendix A). They

had to rate their level of agreement (ie strongly disagree (1), disagree (2), agree (3) or strongly

agree (4)) with different statements relating to each subject. Items measuring predictability

included statements such as 'I predicted the exam questions well' and 'I was surprised by the

questions on the exam this year’. The survey also included questions asking students about their


views about learning, with items such as ‘The exam tests the right kind of learning’ and ‘To do well

in this exam, remembering is more important than understanding’. The idea of including these

items was to further explore whether predictability is linked to students’ views about learning, and

whether they felt that remembering was more important than understanding for some of the

subjects. In addition, students were asked to indicate what kinds of support for their learning they

had in English, biology and geography, with items such as ‘Which topics were likely to come up was

explained to me’, ‘Model answers were given to me’ and ‘My parents helped me with my studies’.

Questions around grind schools and use of revision apps to support students’ learning were also

included.

Piloting the instrument and quality checking

One version of the instrument was piloted with two Irish students who had previously taken the

Leaving Certificate. First, they answered the whole survey. Second, the researcher carried out

cognitive interviews to have feedback on each item. The cognitive interviews involved asking the

participant to (a) read the question, (b) explain what it means, (c) read the answer options and choose

an answer and (d) explain the reason for the answer (Karabenick et al, 2007). Based upon these

interviews, several of the items were revised to make them more suitable for an Irish context. For

example, instead of using the term ‘police officer’ when asking about parental occupational status,

we were advised to use the Irish word Garda. In items asking about classical literature, we included

Yeats instead of Shakespeare.3 A final version of the survey was given to the research team and

four Dphil students for feedback on wording and layout. Minor revisions were conducted before

sending to SEC for additional feedback. Based upon these review processes, the final version of the Leaving Certificate survey was a ten-page questionnaire in six sections.

Survey versions

A paper-and-pencil version and an online version of the survey were prepared in English and in Irish.

Paper-and-pencil survey

For maximum participation, it was decided that the paper-and-pencil survey should not be more

than ten pages long. The first page included information on the purpose of the study and general information on confidentiality. Students were asked to write their exam number and to give the researchers permission to link their exam scores to the survey

results. Students were also informed of a prize draw for five Apple iPads if they completed the

survey (see Appendix A).

Online survey

An online version of the survey was posted on the Oxford University Centre for Educational Assessment's homepage from 4 June until 1 August 2013 (http://oucea.education.ox.ac.uk/about-us/oucea-commissioned-to-conduct-independent-external-evaluation-of-predictability-in-irish-leaving-certificate-examinations). In addition, posters with information about the online version were displayed in all 100 schools that participated in the study, so that students could choose between completing a paper or an online survey after they had finished their exams.

3 This item is taken from the PISA student questionnaire, which uses the question 'Which of the following are in your home?', with the option

‘Classical literature (for example Shakespeare)’.


Sample

After excluding 31 schools because they were listed as having no students in LC year 2 in the DES

data, a list of 690 schools in Ireland was sent to the research team by SEC. These 690 schools

included 79 community schools, 14 comprehensive schools, 375 secondary schools and 223

vocational schools. Further, 108 of them were boys’ schools, 140 were girls’ schools, and 442 were

mixed schools. From the list of 690 schools, 24 schools were selected for the fieldwork interviews, and these schools were not included in the survey, to avoid overburdening them.

From the list of the remaining 666 schools, 100 schools were selected to participate in the survey,

which is more than 10% of the schools in Ireland offering LC. The paper-and-pencil version was

printed and distributed by SEC to these 100 schools, so that students could participate in the

survey after they had taken the Leaving Certificate. Prepaid envelopes were offered to facilitate the

return of surveys. Posters giving information about the survey were displayed at the back of the exam room, encouraging students to complete the survey online if they preferred.

The combined sample of students who responded to the paper-and-pencil and online surveys

comprised 1,018 students. We removed 11 surveys, since a quality check showed that those students had completed both the paper-and-pencil and the online survey. Additionally, five students were removed for having examination numbers with five digits, which means they were not part of the target sample of LC candidates. A total of 1,002 students were left in the sample: 147 surveys came from the online version and 855 from the paper-and-pencil version.

Analyses of participants’ examination scores in English, biology and geography indicated that the

sample had a wide spread of abilities, but higher performing students were represented more

frequently than in the general population of Leaving Certificate students, and results must be

interpreted in that context (see Table 1).

Table 1. Cumulative percentage at each grade: questionnaire sample compared with population (Pop)

Grade      English           Biology           Geography
           Sample    Pop     Sample    Pop     Sample    Pop
A          14.4      9.7     22.3      14.4    15.4      8.7
B          43.6      36.4    55.5      41.7    51.6      38.1
C          81.9      76.1    77.8      69.6    85.6      75.3
D          99.1      98.3    94.3      91.7    99.1      97.2
E          99.9      99.9    99.2      98.2    100       99.8
F          99.9      100     100       99.7    100       100
NG         100       100     100       100     100       100
No.        624       33,279  449       23,436  312       19,762

Out of the sample of 1,002 students the analysis is concerned only with those students who took

the higher level LC exam. The numbers vary by subject. The final sample for the English analysis

includes 772 students, 557 for biology and 404 for geography.
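The cumulative percentages reported in Table 1 are running sums of grade counts expressed as a share of the total. A short sketch on fabricated grade counts (not the study's data) makes the arithmetic explicit:

```python
# Fabricated grade counts for one subject, ordered from highest grade down.
counts = {"A": 90, "B": 182, "C": 240, "D": 108, "E": 5, "F": 0, "NG": 0}

total = sum(counts.values())
running = 0
cumulative = {}
for grade, n in counts.items():   # dicts preserve insertion order
    running += n
    cumulative[grade] = round(100 * running / total, 1)

print(cumulative)
```

By construction the cumulative percentage reaches 100 at the lowest grade, as in every column of Table 1.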


Data management

The research team developed a codebook and adjusted some of the codes after the first 100

surveys had been entered.

Two research assistants entered data using the statistical package SPSS 20. In addition, three

researchers each entered ten surveys to check how the responses matched the codebook and

discussed the coding with the research assistants. One of the few problems detected was that

respondents sometimes ticked more than one box for parents’ education level. Initially some of the

research assistants coded this as 'invalid'; this was later revised so that the highest level of education was recorded. Another challenge was deciphering handwriting in the open question, as respondents frequently did not write legibly. In cases of doubt, the research assistants discussed the interpretation of the handwriting.

After the data had been entered into SPSS, a quality check was conducted. One in every 20 surveys from the paper-and-pencil version was double-checked to verify that the data had been entered correctly. Only two minor errors were found in a total of 39 checked surveys, which indicates good overall data-entry quality. The online version of the survey had automatic data entry.

Scale development

Methodology

Statistical models

Exploratory factor analysis (EFA), the Rasch model, and the partial credit model were employed for

scale development (Masters & Wright, 1997; Rasch, 1960). EFA was applied to the Likert-type items

surveying learning strategies (see question 11, Appendix A) and experiences in taking the exam

(see question 12, Appendix A). The Rasch model was applied to binary data of learning support (see

question 13, Appendix A), and the partial credit model was applied to the binary and ordinal data

on family SES. The Rasch model and the partial credit model assumed that the item data could be

represented by a single dimension. EFA employed different tests to determine the number of

factors to be retained. Missing data in the number-of-factors tests and in scale development were handled using listwise deletion.
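As a rough illustration of the Rasch model applied to binary items, the sketch below computes the model's response probability and crude, centred log-odds difficulty estimates on fabricated data. It is an assumption-laden stand-in, not the estimation software used in this study.

```python
import numpy as np

def rasch_probability(theta, b):
    """P(positive response) under the Rasch model, for ability theta and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Fabricated binary learning-support data: 200 students x 5 items, generated so
# that items range from commonly endorsed (0.8) to rarely endorsed (0.2).
rng = np.random.default_rng(0)
X = (rng.random((200, 5)) < np.array([0.8, 0.6, 0.5, 0.4, 0.2])).astype(int)

# Crude difficulty estimates: the log-odds of a negative response per item,
# centred to sum to zero (a common Rasch identification constraint).
p = X.mean(axis=0)          # proportion endorsing each item
b = np.log((1 - p) / p)     # higher difficulty = less frequently endorsed
b = b - b.mean()

print(np.round(b, 2))
```

A full analysis would estimate these difficulties jointly with person abilities (e.g. by maximum likelihood), and the partial credit model extends the same logic to ordinal items.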

Number of factors in EFA

Determining the number of factors to retain is critical for scale development in EFA. If the number

of factors is underestimated or overestimated, the solution and interpretation of EFA results could

be significantly altered (Velicer, Eaton & Fava, 2000). For example, theoretically relevant scales may

be excluded if the number of factors is underestimated. Conversely, if the number of factors is

overestimated artificial scales may be produced.

Typically, analysts and statistical computer software employ Kaiser's (1960) rule of eigenvalues

greater than one, or scree visual tests proposed by Cattell (1966) for selecting the number of

factors to retain. Kaiser's rule in particular, due to its simplicity, is probably the most utilised

criterion for factor selection. This rule, however, has several problems. It has been argued that it

tends to overestimate the number of factors, that it was developed for principal component


analysis and its use for EFA is unclear, and that it can produce trivial solutions in which a factor with

an eigenvalue of 1.01 is retained and one with an eigenvalue of 0.99 is not (Courtney, 2013;

Fabrigar et al, 1999). Scree visual tests, on the other hand, depend on the ability of the rater and suffer from inherent subjectivity and poor inter-rater reliability. Researchers have proposed three

alternative statistical criteria that overcome these limitations (Courtney, 2013; Raiche, Riopel &

Blais, 2006).

The first is the optimal coordinate (OC) test, which determines the location of the scree by

measuring the gradients associated with eigenvalues and their preceding coordinates. Eigenvalues

are projected based on preceding eigenvalues using regression models. The number of principal

components to retain corresponds to the last observed eigenvalue that is greater than or equal to the

estimated predicted eigenvalue. The second is the acceleration factor (AF) test, which puts

emphasis on the coordinate where the slope of the eigenvalue curve changes abruptly. The test is

based on the second derivative of the eigenvalue curve. The third is Horn's (1965) parallel analysis

(PA), which, unlike Kaiser's rule with its reliance on population statistics, takes into account the proportion

of variance resulting from sampling error. The PA method generates a large number of data

matrices from random data in parallel with the real data. That is, the matrices have the same

number of cases and variables as the real data. Factors are retained in the real data as long as

eigenvalues are greater than the mean eigenvalue generated from the random data matrices.
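The logic of parallel analysis can be sketched in a few lines. The version below is illustrative only (it is not the nFactors implementation): it compares the observed correlation-matrix eigenvalues against the mean eigenvalues of random-data matrices of the same size, and also reports the Kaiser count for comparison. The fabricated data are built with three latent factors, echoing the three learning-strategy factors found in this paper.

```python
import numpy as np

def parallel_analysis(data, n_sims=200, seed=1):
    """Horn's parallel analysis: retain leading factors whose observed
    eigenvalues exceed the mean eigenvalues of same-sized random data."""
    n, k = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rng = np.random.default_rng(seed)
    sim = np.empty((n_sims, k))
    for s in range(n_sims):
        noise = rng.normal(size=(n, k))
        sim[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False)))[::-1]
    threshold = sim.mean(axis=0)
    retain = 0
    for o, t in zip(obs, threshold):
        if o > t:
            retain += 1
        else:
            break
    return retain, obs, threshold

# Fabricated data: 600 respondents, 9 items, three latent factors each driving
# three items (values illustrative only).
rng = np.random.default_rng(7)
latent = rng.normal(size=(600, 3))
loadings = np.zeros((3, 9))
for f in range(3):
    loadings[f, 3 * f:3 * f + 3] = 0.9
data = latent @ loadings + rng.normal(size=(600, 9))

retained, obs, threshold = parallel_analysis(data)
n_kaiser = int((obs > 1.0).sum())   # Kaiser's rule for comparison
print(retained, n_kaiser)
```

With a clear three-factor structure both criteria agree; the methods diverge mainly when weak factors hover near the eigenvalue-of-one boundary.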

These methods outperform Kaiser's rule of retaining factors with eigenvalues greater than one

in simulation studies (Ruscio & Roche, 2012). In particular, PA is likely the most strongly

recommended technique but its application is not simple (Courtney, 2013). Recently, however,

these three tests have been implemented in the R package nFactors (Raiche, 2010). These tests

together with Kaiser's rule are compared graphically in this paper for determining the number

of factors to retain.

Results

Learning strategies

EFA was applied to the learning strategies items in the areas of English, biology and geography (ie question 11, except item f). Figure 1 presents, for each subject area, a comparison of the tests used to determine the number of factors to retain: the optimal coordinate test, the acceleration factor test, parallel analysis, and Kaiser's rule. The number of factors retained by each test is shown in parentheses. The different tests, including parallel analysis, quite consistently indicated the presence of three factors in the learning strategies across the three subject areas. The results are also consistent with the number of factors proposed by Marsh et al (2006).

Page 9: Working Paper 3: Student Questionnaire · Questionnaire (MSLQ) (Pintrich, Smith, Garica & McKeachie, 1991). The limited validity of this questionnaire for measuring learning strategies


Figure 1. Learning strategies: tests to determine number of factors
[Panels: English (n=750), Biology (n=540), Geography (n=381).]

Table 2 reports factor loadings (>0.3) for the EFA solution, with each item's theoretical construct marked. The empirical results reflected the latent structure of three factors postulated by Marsh et al (2006): memorisation, elaboration, and control strategies. The three factors have been labelled accordingly in Table 2.

In all three subjects, items loaded on their corresponding theoretical constructs. Additionally, some items loaded on two constructs, and two of them did so consistently across subjects. One is item i, 'I made sure that I remembered the most important points in the revision material', which was expected to reflect control strategies but also loaded on memorisation strategies in English and biology. Since the item included the word 'remembered', students may have answered with memorisation strategies in mind, even though the item itself also involves a control strategy: students exercised control when they made sure that they remembered. From theory, we also know that some of the elaboration and control strategies overlap, and therefore cross-loadings on some of these items were expected. Another example is item h, which corresponds to elaboration strategies but also loaded on the control strategies construct in biology and geography. In addition, item m in geography loaded on control strategies as well as its corresponding memorisation construct. In general, though, the factor structure is very consistent with Marsh et al (2006).



Table 2. Learning strategies: EFA solution (loadings > 0.3)

Factor columns: mem = memorisation, ela = elaboration, con = control. A • marks the
loading on the item's theoretical construct; a bare • indicates a loading on the
theoretical construct below 0.3; '–' indicates no loading above 0.3.

                                        English (n=750)       Biology (n=540)       Geography (n=381)
                                        mem    ela    con     mem    ela    con     mem    ela    con
k) I tried to memorise as much of
   the revision material as possible    0.76•  –      –       0.82•  –      –       0.63•  –      –
e) I tried to learn my notes by heart   0.66•  –      –       0.60•  –      –       0.65•  –      –
a) I tried to memorise all the
   material that I was taught           0.63•  –      –       0.59•  –      –       0.68•  –      –
m) I tried to memorise what I
   thought was important                0.59•  –      –       0.56•  –      –       0.41•  –      0.39
g) I figured out how the information
   might be useful in the real world    –      0.71•  –       –      0.69•  –       –      0.55•  –
c) I tried to relate new information
   to knowledge from other subjects     –      0.59•  –       –      0.53•  –       –      0.51•  –
h) I tried to understand the revision
   material better by relating it to
   what I already knew                  –      0.58•  –       –      0.58•  0.34    –      0.52•  0.34
n) I studied material that went
   beyond what is expected for the
   exam                                 –      0.39•  –       –      •      –       –      0.33•  –
i) I made sure that I remembered the
   most important points in the
   revision material                    0.39   –      0.34•   0.41   –      0.42•   –      –      0.62•
d) I checked if I understood what I
   had read                             –      –      0.47•   –      –      0.53•   –      –      0.39•
j) If I did not understand something,
   I looked for additional
   information to clarify it            –      –      0.67•   –      –      0.68•   –      –      0.36•
l) I tried to figure out which ideas
   I had not really understood          –      –      0.53•   –      –      0.56•   –      0.34   0.36•
b) I started by figuring out exactly
   what I needed to learn               –      –      •       –      –      0.32•   –      –      0.40•


Table 3 reports alpha reliability coefficients for the theoretical constructs as well as the percentage of

the variance of the learning strategies data explained by the three constructs. Overall, the three factors

accounted for 52% of the variance in English, 55% in biology and 48% in geography. In the PISA 2000

study on reading literacy, alpha coefficients across countries ranged from 0.69 to 0.81 (Marsh et al

2006).4 The reliability estimates in our analysis are similar but on the low side.
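For reference, the alpha coefficients reported below follow the standard Cronbach formula computed from item and total-score variances. A minimal sketch (our own illustration, not the authors' code):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of item scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)
```

Perfectly parallel items give alpha = 1; weakly correlated items push alpha down toward the 0.5-0.7 range seen for some of the scales in Tables 3 and 5.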

Table 3. Learning strategies: Alpha coefficients and explained variance

                          English    Biology    Geography
                          (n=750)    (n=540)    (n=381)
memorisation               0.76       0.75       0.71
elaboration                0.68       0.69       0.56
control                    0.64       0.69       0.62
% explained variance       52%        55%        48%

The distribution of the learning strategies scales for English, biology, and geography is presented in

Figures 2, 3 and 4.

Figure 2. English: distribution of learning strategies scales

4 The average reliability of the scales used in PISA 2000 varied between countries. Norway, the United States and Finland had higher

reliability (M Alphas = 0.81, 0.81, 0.81) while countries such as Latvia, Mexico and Brazil had lower reliabilities (M Alphas = 0.69, 0.70,

0.73).

[Density plots of the memorisation, elaboration, and control strategies scales (EFA).]


Figure 3. Biology: distribution of learning strategies scales
[Density plots of the memorisation, elaboration, and control strategies scales (EFA).]

Figure 4. Geography: distribution of learning strategies scales
[Density plots of the memorisation, elaboration, and control strategies scales (EFA).]


Views on predictability

As with the learning strategies items, EFA was applied to the items reflecting students' views on predictability (ie question 12 and item f of question 11). Tests to determine the number of factors were applied to the Likert-type items (see Figure 5). Parallel analysis, the most reliable test, indicated three factors in English and geography and four in biology. For consistency, it was decided to retain three factors in every subject.

Figure 5. Views on predictability: tests to determine number of factors
[Panels: English (n=749), Biology (n=536), Geography (n=387).]

Table 4 reports the EFA solution. The loading patterns were quite consistent across subjects with

practically no overlap between items and constructs. Only item h in geography loaded on two

constructs.

The first latent construct reflected views of students that they will be able to use what they have

learned for the future (item h), that they need to adapt what they know to do well in the exam (item d),

that the exam tests the right kind of learning (item c), that a broad understanding of the subject is

important to do well in the exam (item f), and that remembering is not more important than

understanding (item b). The second latent construct distinguished students who said they were able to



predict the exam questions well (item i), felt they knew what the examiners wanted this year (item a),

and were not surprised by the exam questions this year (item e). And the third construct indicated

students who chose not to study some topics as they thought they would not come up (item f from

question 11) and students who left a lot of topics out of their revision and still think they will do well

(item g). In general, it seemed the first factor reflected valuable learning views, the second factor

predictability views, and the third factor narrowing the curriculum views. Accordingly, these factors

have been labelled ‘valuable’, ‘predictable’, and ‘narrow’ in Table 4. The valuable factor reflects a

positive view of the examinations and the narrow factor reflects a negative impact (where the scores

are high in each case). Note, however, that the predictable factor does not necessarily reflect problematic or negative aspects of predictability; it can reflect desired aspects of predictability as well.


Table 4. Views on predictability: EFA solution (loadings > 0.3)

Factor columns: val = valuable, pred = predictable, nar = narrow; '–' indicates
no loading above 0.3.

                                        English (n=749)       Biology (n=536)       Geography (n=387)
                                        val    pred   nar     val    pred   nar     val    pred   nar
h) I think I will be able to use what
   I learned for this exam in the
   future                               0.56   –      –       0.50   –      –       0.45   0.34   –
d) To do well in this exam I need to
   think and adapt what I know          0.52   –      –       0.58   –      –       0.56   –      –
c) The exam tests the right kind of
   learning                             0.46   –      –       0.54   –      –       0.45   –      –
f) To do well in this exam, I need a
   broad understanding of the subject,
   across many topics                   0.44   –      –       0.39   –      –       –      –      –
b) To do well in this exam,
   remembering is more important than
   understanding                       -0.47   –      –      -0.39   –      –      -0.41   –      –
i) I predicted the exam questions
   well                                 –      0.66   –       –      0.58   –       –      0.66   –
a) I felt I knew what the examiners
   wanted this year                     –      0.50   –       –      0.60   –       –      0.48   –
e) I was surprised by the questions
   on the exam this year                –     -0.40   –       –     -0.54   –       –     -0.50   –
g) I left a lot of topics out of my
   revision and still think I will do
   well                                 –      –      0.99    –      –      0.75    –      –      0.99
q11f) I chose not to study some
   topics as I thought they would not
   come up                              –      –      0.43    –      –      0.57    –      –      0.43
j) I can do well in this exam even if
   I do not fully understand the
   topics                              -0.31   –      –       –      –      –       –      –      –


Table 5 reports alpha coefficients and explained variances for the proposed constructs. The three

constructs explained 48% of the total variance in the item data for the three subjects. Alpha coefficients

ranged between 0.53 and 0.62.

Table 5. Views on predictability: Alpha coefficients and explained variance

                          English    Biology    Geography
                          (n=749)    (n=536)    (n=387)
valuable                   0.62       0.53       0.55
predictable                0.53       0.61       0.58
narrow                     0.61       0.61       0.59
% explained variance       48%        48%        48%

Figures 6, 7 and 8 show the distribution of the views on predictability scale for English, biology, and

geography. The valuable learning and predictability scales are smoothly distributed. The narrowing the curriculum scale is less continuous and appears multimodal, reflecting the fact that only two items loaded on it.

Figure 6. English: distribution of views on predictability scales
[Density plots of the valuable learning, predictability, and narrowing the curriculum scales (EFA).]


Figure 7. Biology: distribution of views on predictability scales
[Density plots of the valuable learning, predictability, and narrowing the curriculum scales (EFA).]

Figure 8. Geography: distribution of views on predictability scales
[Density plots of the valuable learning, predictability, and narrowing the curriculum scales (EFA).]

Learning support

Students were surveyed on the learning support they received for the Leaving Certificate examination (see question 13, Appendix A). Question 13 included 14 items for each subject area covering the different kinds of support students received. Tests to determine the number of factors to extract were carried out on the item data. The results are presented in Figure 9.



Figure 9. Learning support: tests to determine the number of factors
[Panels: English (n=746), Biology (n=541), Geography (n=383).]

Guided by the parallel analysis results, four latent constructs were identified for English and biology, and three for geography. In unreported analyses, the four-factor solution produced uninterpretable results, while the three-factor solution seemed to group students according to whether they received learning support from the school (F1), from external sources such as the internet, parents and friends (F2), or from grinds schools (F3). Since it is not an objective of this paper to analyse learning support in depth, it was decided to consider only two dimensions of learning support: school support and external support. Rasch modelling was preferred over EFA for scale development because it is better suited to the binary data of the learning support items (ie received a given kind of support or not). Items a, b, d, e and g were considered indicators of school support, and items c, f, h, i, j, k, l, m and n indicators of external support. A single learning support scale using all the item data was also created.

Table 6 reports item weights resulting from the Rasch analysis for the school learning support scale. In all subjects, item e ('I was given past papers') had the lowest weight; that is, most students reported that they were given past papers. As the Rasch analyses were conducted separately for each subject, the estimates are not comparable across subjects in Tables 6 to 9.
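The direction of these weights can be illustrated with a deliberately crude approximation: under the Rasch model, an item's difficulty moves inversely with its endorsement rate, roughly the negative logit of the proportion of students reporting the support. The sketch below is our own simplification in Python, not the model actually fitted in the paper.

```python
import numpy as np

def crude_item_weights(responses):
    """responses: (n_students, n_items) binary matrix (1 = received the support).
    Returns -logit(endorsement proportion) per item: supports that almost
    everyone reports (eg being given past papers) get strongly negative
    weights, while rare supports (eg attending a grinds school) get
    strongly positive ones."""
    p = np.asarray(responses, dtype=float).mean(axis=0)
    p = np.clip(p, 1e-6, 1 - 1e-6)   # guard against 0% or 100% endorsement
    return np.log((1 - p) / p)
```

An item endorsed by 50% of students sits at zero on this crude scale, mirroring the near-zero Rasch weights of middling-frequency supports in the tables below.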



Table 6. School learning support: item weights and standard errors

                                           English (n=746)   Biology (n=542)   Geography (n=384)
                                           Estimate (SE)     Estimate (SE)     Estimate (SE)
b) Marking criteria were explained to me    0.47 (0.10)       0.02 (0.12)       0.08 (0.16)
d) Model answers were given to me           0.46 (0.10)       2.72 (0.14)       0.11 (0.16)
e) I was given past papers                 -1.12 (0.15)      -2.58 (0.24)      -0.86 (0.20)
g) The exam format was explained to me     -0.78 (0.13)      -0.83 (0.14)      -0.45 (0.18)

Table 7 records outfit and infit statistics for the constituent items of the school learning support scale. Fit statistics were within acceptable ranges [0.8–1.2], except for item g across subjects and item b in biology, which had values lower than 0.7. These items overfitted; that is, the pattern of responses did not vary as much as the Rasch model expects, with most students ticking that they had received these kinds of support. However, including these items in the scale did not degrade construct development (<0.5).
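The outfit and infit mean squares reported in Tables 7, 9, 11 and 13 are standard functions of the standardised response residuals. Under assumed notation (x = observed 0/1 response, p = model-implied probability of endorsement), a sketch of the two statistics, not the software used for the paper's analyses:

```python
import numpy as np

def fit_mean_squares(x, p):
    """Outfit MSQ: mean of squared standardised residuals.
    Infit MSQ: squared residuals weighted by the response variance p(1-p).
    Values near 1 indicate good fit; values well below ~0.8 indicate
    overfit (responses less variable than the model expects)."""
    x = np.asarray(x, dtype=float)
    p = np.asarray(p, dtype=float)
    w = p * (1 - p)                          # binomial variance of each response
    outfit = np.mean((x - p) ** 2 / w)       # unweighted mean square
    infit = np.sum((x - p) ** 2) / np.sum(w) # information-weighted mean square
    return outfit, infit
```

If students almost always give the model's favoured response (eg x = 1 whenever p = 0.9), the squared residuals are small relative to p(1-p) and both statistics fall below 1, the overfit pattern seen for items such as g.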

Table 7. School learning support: item fit statistics

                                           English (n=746)   Biology (n=542)   Geography (n=384)
                                           Outfit  Infit     Outfit  Infit     Outfit  Infit
                                           MSQ     MSQ       MSQ     MSQ       MSQ     MSQ
a) Which topics were likely to come up
   was explained to me                      1.01    1.03      0.75    0.79      1.15    1.12
b) Marking criteria were explained to me    1.01    1.01      0.56    0.68      0.79    0.83
d) Model answers were given to me           1.07    1.05      1.58    0.85      1.09    1.07
e) I was given past papers                  0.89    0.86      1.15    0.71      0.91    0.90
g) The exam format was explained to me      0.59    0.70      0.65    0.78      0.68    0.73

Tables 8 and 9 record item weights and fit statistics for the external learning support scale.


Table 8. External learning support: item weights and standard errors

                                           English (n=746)   Biology (n=541)   Geography (n=384)
                                           Estimate (SE)     Estimate (SE)     Estimate (SE)
f) I have textbooks to help with my study  -2.60 (0.10)      -3.82 (0.19)      -3.82 (0.21)
h) I used revision guides                  -1.01 (0.08)      -1.33 (0.10)      -1.48 (0.12)
i) I looked at past papers on the internet -1.59 (0.08)      -1.86 (0.11)      -1.65 (0.12)
j) My parents helped me with my studies     0.61 (0.09)       1.45 (0.13)       1.08 (0.15)
k) Friends helped me to prepare for the
   exams                                   -0.32 (0.08)      -0.62 (0.10)      -0.57 (0.12)
l) I used revision apps                     1.46 (0.11)       1.50 (0.13)       1.53 (0.17)
m) I took one-to-one or small-group grinds  1.36 (0.11)       1.68 (0.13)       1.79 (0.18)
n) I attended a grinds school               2.10 (0.14)       2.39 (0.17)       2.79 (0.25)

The results in Table 8 show consistently across subjects that attending a grinds school (item n) was the

least frequent on the external support scale and having textbooks to help with the study (item f) the

most frequent.

Table 9. External learning support: item fit statistics

                                           English (n=746)   Biology (n=541)   Geography (n=384)
                                           Outfit  Infit     Outfit  Infit     Outfit  Infit
                                           MSQ     MSQ       MSQ     MSQ       MSQ     MSQ
c) I used material from grinds websites     0.93    0.97      0.70    0.81      0.72    0.81
f) I have textbooks to help with my study   1.13    0.97      1.40    0.72      0.63    0.77
h) I used revision guides                   0.84    0.89      0.77    0.86      0.87    0.90
i) I looked at past papers on the internet  0.90    0.93      1.01    0.93      0.76    0.87
j) My parents helped me with my studies     0.82    0.91      0.78    0.97      0.89    0.99
k) Friends helped me to prepare for the
   exams                                    0.93    0.96      1.08    1.02      0.89    0.94
l) I used revision apps                     0.67    0.82      0.54    0.77      0.84    0.83
m) I took one-to-one or small-group grinds  0.81    0.86      0.63    0.84      0.68    0.83
n) I attended a grinds school               0.71    0.88      0.86    0.84      0.65    0.85

Fit statistics for English were within acceptable ranges (see Table 9). Some items introduced misfit in

biology and geography. For example, fit statistics lower than 0.7 for items l and m in biology and item n

in geography indicated overfit to the Rasch model.

Tables 10 and 11 report item weights and fit statistics for the combined learning support scale. Item weight estimates in Table 10 were quite consistent across the three subject areas. The last three items of question 13 exerted the greatest weight on the learning support scale: attending a grinds school (n), taking one-to-one or small-group grinds (m), and using revision apps (l). These were the forms of support students were least likely to report. Conversely, having been given past papers (e), having had the exam format explained (g), and having textbooks to help with learning (f) showed the weakest weights.


Table 10. Learning support scale: item weights and standard errors

                                           English (n=746)   Biology (n=541)   Geography (n=383)
                                           Estimate (SE)     Estimate (SE)     Estimate (SE)
b) Marking criteria were explained to me   -1.46 (0.10)      -1.29 (0.11)      -1.95 (0.16)
c) I used material from grinds websites     1.00 (0.08)       1.25 (0.10)       1.43 (0.13)
d) Model answers were given to me          -1.47 (0.10)       1.02 (0.10)      -1.95 (0.16)
e) I was given past papers                 -2.81 (0.15)      -3.26 (0.20)      -2.82 (0.22)
f) I have textbooks to help with my study  -1.50 (0.10)      -2.88 (0.17)      -2.55 (0.19)
g) The exam format was explained to me     -2.52 (0.14)      -1.99 (0.13)      -2.40 (0.18)
h) I used revision guides                   0.04 (0.08)      -0.58 (0.10)      -0.31 (0.12)
i) I looked at past papers on the internet -0.52 (0.08)      -1.09 (0.10)      -0.48 (0.12)
j) My parents helped me with my studies     1.63 (0.09)       2.06 (0.13)       2.20 (0.15)
k) Friends helped me to prepare for the
   exams                                    0.73 (0.08)       0.08 (0.09)       0.56 (0.12)
l) I used revision apps                     2.46 (0.12)       2.11 (0.13)       2.61 (0.17)
m) I took one-to-one or small-group grinds  2.37 (0.11)       2.29 (0.13)       2.86 (0.18)
n) I attended a grinds school               3.09 (0.15)       2.98 (0.17)       3.85 (0.26)

In general, fit statistics for the combined learning support scale were within acceptable ranges (see

Table 11).

Table 11. Learning support scale: item fit statistics

                                           English (n=746)   Biology (n=541)   Geography (n=383)
                                           Outfit  Infit     Outfit  Infit     Outfit  Infit
                                           MSQ     MSQ       MSQ     MSQ       MSQ     MSQ
a) Which topics were likely to come up
   was explained to me                      0.97    0.92      0.89    0.96      0.89    0.97
b) Marking criteria were explained to me    0.81    0.88      0.80    0.89      1.53    0.86
c) I used material from grinds websites     1.08    1.02      0.96    0.91      1.02    0.89
d) Model answers were given to me           1.13    0.93      0.94    0.97      1.05    0.93
e) I was given past papers                  0.92    0.82      1.05    0.78      0.63    0.79
f) I have textbooks to help with my study   1.01    0.98      0.74    0.88      0.74    0.85
g) The exam format was explained to me      0.70    0.80      0.92    0.88      0.59    0.75
h) I used revision guides                   0.89    0.93      0.82    0.90      0.88    0.92
i) I looked at past papers on the internet  0.89    0.94      0.92    0.95      0.91    0.98
j) My parents helped me with my studies     0.92    0.89      0.99    0.98      0.83    0.93
k) Friends helped me to prepare for the
   exams                                    1.07    0.98      1.00    0.98      0.88    0.93
l) I used revision apps                     0.76    0.85      0.91    0.88      0.86    0.86
m) I took one-to-one or small-group grinds  0.81    0.86      0.84    0.87      0.94    0.84
n) I attended a grinds school               1.17    0.88      0.93    0.88      0.51    0.81


Figure 10 presents the distribution of the learning support scale for the three subject areas.

Figure 10. Learning support scale: distribution
[Density plots of the learning support scale (IRT) for English, biology, and geography.]

Family SES

The dichotomous data on home possessions and the ordinal data on parental education and number of books were summarised into a single family SES scale using the partial credit model. The model was applied to the full sample of students, not the subject-specific samples. Item-weight estimates of the final SES model are presented in Table 12.



Table 12. SES partial credit model: item weights and standard errors

SES items (n=919)                                                                    Estimate (SE)
Mother's education: b) Primary education                                             -2.63 (0.35)
Mother's education: c) Lower secondary education (Junior/Inter Cert or equivalent)   -2.57 (0.33)
Mother's education: d) Upper secondary education (Leaving Cert or equivalent)        -0.14 (0.34)
Mother's education: e) Post-secondary non-tertiary (eg PLC)                           0.01 (0.32)
Mother's education: f) Non-degree (certificate/diploma)                               1.02 (0.32)
Mother's education: g) Bachelor's degree                                              3.28 (0.34)
Mother's education: h) Postgraduate degree (Masters or PhD)                           5.00 (0.38)
Father's education: a) Did not go to school                                          -2.57 (0.43)
Father's education: b) Primary education                                             -3.26 (0.40)
Father's education: c) Lower secondary education (Junior/Inter Cert or equivalent)   -2.25 (0.37)
Father's education: d) Upper secondary education (Leaving Cert or equivalent)         0.52 (0.38)
Father's education: e) Post-secondary non-tertiary (eg PLC)                           0.25 (0.33)
Father's education: f) Non-degree (certificate/diploma)                               1.50 (0.33)
Father's education: g) Bachelor's degree                                              3.20 (0.33)
Father's education: h) Postgraduate degree (Masters or PhD)                           4.46 (0.34)
Home possessions: a) A TV                                                            -3.67 (0.31)
Home possessions: b) A car                                                           -2.09 (0.15)
Home possessions: c) A dishwasher                                                    -0.41 (0.10)
Home possessions: d) A room of your own                                              -1.21 (0.11)
Home possessions: e) A quiet place to study                                          -0.30 (0.09)
Home possessions: f) A computer or laptop you can use for school work                -1.48 (0.12)
Home possessions: g) Internet access                                                 -2.28 (0.17)
Home possessions: h) An iPad or other tablet of your own                              2.62 (0.10)
Home possessions: i) A smartphone (for example, iPhone, Blackberry, or Android)
  of your own                                                                        -0.03 (0.09)
Home possessions: j) A mobile phone of your own                                      -1.30 (0.12)
Home possessions: k) A PlayStation, X-box, or Wii                                    -0.35 (0.09)
Home possessions: l) Classic literature (for example, W.B. Yeats, James Joyce, or
  Maria Edgeworth)                                                                    1.28 (0.09)
Home possessions: m) A dictionary                                                    -1.79 (0.14)
Number of books: (0 – 10 books)                                                       0.04 (0.16)
Number of books: (26 – 100 books)                                                    -0.23 (0.17)
Number of books: (101 – 200 books)                                                    1.04 (0.20)
Number of books: (201 – 500 books)                                                    2.39 (0.24)
Number of books: (More than 500 books)                                                3.70 (0.29)


The threshold parameters more or less consistently indicated greater weights for higher categories of parental education and number of books. Item weights for the home possessions items indicated that owning an iPad exerted the greatest weight on SES and having a TV the lowest. Item fit statistics are presented in Table 13.

Table 13. SES partial credit model: item fit statistics

SES items (n=919)                                                       Outfit MSQ   Infit MSQ
Mother's education                                                         0.78         0.78
Father's education                                                         1.07         0.89
Home possessions: a) A TV                                                  0.90         0.89
Home possessions: b) A car                                                 0.83         0.92
Home possessions: c) A dishwasher                                          0.93         0.96
Home possessions: d) A room of your own                                    1.02         0.97
Home possessions: e) A quiet place to study                                0.93         0.93
Home possessions: f) A computer or laptop you can use for school work      0.79         0.87
Home possessions: g) Internet access                                       0.60         0.85
Home possessions: h) An iPad or other tablet of your own                   0.96         0.98
Home possessions: i) A smartphone (for example, iPhone, Blackberry,
  or Android) of your own                                                  1.07         1.02
Home possessions: j) A mobile phone of your own                            1.11         1.02
Home possessions: k) A PlayStation, X-box, or Wii                          1.01         1.00
Home possessions: l) Classic literature (for example, W.B. Yeats,
  James Joyce, or Maria Edgeworth)                                         0.85         0.88
Home possessions: m) A dictionary                                          0.64         0.85
Number of books                                                            0.89         0.87

Item fit statistics were within acceptable ranges, except for items g, ‘internet access’, and m, ‘a

dictionary’, with outfit values of 0.60 and 0.64, respectively.

Figure 11 presents the distribution of the SES scale.


Figure 11. SES distribution
[Density plot of the SES scale (IRT scores).]

Analysis of research questions

Research question 3 – how predictable are examination questions in the

Leaving Certificate in Ireland?

This question is addressed with information reported by students on their experiences with the exam

and views on predictability (see question 12, Appendix A). Students were asked to report on a Likert

scale (ie strongly disagree, disagree, agree, strongly agree) their agreement with different statements

regarding the exam. Table 14 presents a summary of responses. Categories ‘agree’ and ‘strongly agree’

have been combined into a single category, ie ‘agree’. The percentage of the combined ‘agree’ category

is reported together with the total number of valid responses.
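The collapsing used for Table 14 can be sketched as follows. Coding the Likert categories 1-4 is an assumption of this illustration (the paper does not state the numeric coding), and missing responses are dropped from the valid n:

```python
def percent_agree(responses):
    """responses: iterable of Likert codes 1-4 (1 = strongly disagree,
    4 = strongly agree), with None for a missing answer.
    Returns (percentage choosing agree/strongly agree, valid n)."""
    valid = [r for r in responses if r is not None]
    n_agree = sum(1 for r in valid if r >= 3)  # 'agree' + 'strongly agree'
    return round(100 * n_agree / len(valid)), len(valid)

# Example: five valid answers, three of them agree/strongly agree.
percent_agree([1, 2, 3, 4, 4, None])  # → (60, 5)
```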



Table 14. Views on the exam by subject area: percentage agreeing (%) and valid responses (n)

                                                      English      Biology      Geography
                                                      %     n      %     n      %     n
a) I felt I knew what the examiners wanted this
   year                                               63%   760    47%   544    58%   395
b) To do well in this exam, remembering is more
   important than understanding                       47%   760    55%   546    62%   395
c) The exam tests the right kind of learning          34%   760    45%   546    42%   396
d) To do well in this exam I need to think and
   adapt what I know                                  82%   759    72%   546    80%   394
e) I was surprised by the questions on the exam
   this year                                          32%   761    73%   546    49%   396
f) To do well in this exam, I need a broad
   understanding of the subject, across many topics   69%   760    88%   548    84%   394
g) I left a lot of topics out of my revision and
   still think I will do well                         38%   762    29%   549    44%   395
h) I think I will be able to use what I learned
   for this exam in the future                        36%   761    72%   547    56%   395
i) I predicted the exam questions well                69%   760    31%   549    49%   395
j) I can do well in this exam even if I do not
   fully understand the topics                        37%   760    32%   548    42%   394

A considerable number of students reported that they predicted the exam questions well, although the percentages varied by subject: 69% in English, 49% in geography and 31% in biology. Interestingly, 72% of students in biology reported believing that they will be able to use what they learned for the exam in the future, whilst only 36% believed the same about English. In other words, judging by students' beliefs, there seem to be positive aspects of the biology exam compared with the other subjects. This is reinforced by the fact that only 32% of biology students believe it is possible to do well in the exam without fully understanding the topics, and that 88% agree with the statement 'To do well in this exam, I need a broad understanding of the subject, across many topics', the highest agreement among the three subjects. Together, these results indicate that the biology exam is less predictable, examines a broad kind of understanding, and is valued because the knowledge is seen as useful for the future.

Research question 4 – which aspects of this predictability are helpful and which

engender unwanted approaches to learning?

Different analyses are considered to address this question. One is factor analysis of the views on

predictability items presented before (see Table 4). Another is the association between the examination

scores and agreement with these items. The learning strategies items also contribute to addressing this

question. The association between the memorisation strategies and average examination scores is

presented, as well as the correlation between the learning strategies scales and the examination scores.

The factor solution of the views on predictability items produced three factors that we labelled ‘valuable’, ‘predictable’ and ‘narrow’ (see Table 4). The first factor reflected helpful aspects of predictability, or the view that preparing for the exam is a valuable learning process. Grouped in this factor are students who reported that they will be able to use what they have learned in the future, that the exam tests the right kind of learning, and that a broad understanding of the subject is important to do well in the exam. The second factor reflected views that the exam is predictable: for example, that it was possible to predict exam questions well and that students were not surprised by the exam questions. Unlike the first factor, which reflects valuable learning, it is not clear whether this factor reflects helpful or unwanted aspects of predictability; some level of predictability is expected and desired, but predictability due to memorisation strategies, for example, can be problematic for learning. The third factor reflected views about narrowing the curriculum for exam preparation. For example, it captures the extent to which students chose not to study some topics because they thought they would not come up in the exam, and the number of students who left a lot of topics out of their revision and still think they will do well.
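The retention step behind such a factor solution can be illustrated with a simple rule of thumb. Below is a minimal numpy sketch of the Kaiser eigenvalue-greater-than-one criterion on synthetic item responses; it is illustrative only (the actual analysis used the retention methods described in the methodology section), and the data are invented, with six items loading on two latent factors.

```python
import numpy as np

def kaiser_count(data):
    """Number of factors whose correlation-matrix eigenvalue exceeds 1
    (the Kaiser criterion; one of several factor-retention rules)."""
    corr = np.corrcoef(data, rowvar=False)
    return int((np.linalg.eigvalsh(corr) > 1).sum())

# Synthetic items loading on two latent factors -> two retained factors.
rng = np.random.default_rng(2)
f1, f2 = rng.normal(size=(2, 500))
items = np.column_stack(
    [f1 + rng.normal(0, 0.5, 500) for _ in range(3)]
    + [f2 + rng.normal(0, 0.5, 500) for _ in range(3)]
)
print(kaiser_count(items))  # → 2
```

In practice the Kaiser rule is usually cross-checked against parallel analysis and scree-based criteria, as in the references cited in this paper.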

Table 15 reports average exam scores for the views on predictability items for two combined

categories, ‘agree’ (ie ‘agree’ and ‘strongly agree’) and ‘disagree’ (ie ‘strongly disagree’ and ‘disagree’).

Scores are derived from the data on student grades using the Central Applications Office (CAO) scheme.

The scale of score points ranges from 0 to 100 for the higher level examination.
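As a purely illustrative sketch of such a grade-to-points conversion, a lookup table in this spirit might look as follows. The point values below are an assumption modelled on the commonly cited pre-2017 higher-level CAO bands, not the exact scheme used in the study.

```python
# Hypothetical grade-to-points lookup in the spirit of the CAO scheme for
# higher-level papers; the exact values here are an assumption, not taken
# from the study.
CAO_POINTS = {
    "A1": 100, "A2": 90,
    "B1": 85, "B2": 80, "B3": 75,
    "C1": 70, "C2": 65, "C3": 60,
    "D1": 55, "D2": 50, "D3": 45,
}

def points(grade):
    """CAO points for a higher-level grade (0 for grades below D3)."""
    return CAO_POINTS.get(grade, 0)

print(points("A1"), points("C2"))  # → 100 65
```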


Table 15. Views on predictability and exam scores: average scores (M) and valid responses (n)

                                                           English           Biology           Geography
                                                           Disagree  Agree   Disagree  Agree   Disagree  Agree
Predictability scale
i) I predicted the exam questions well                M    69.07     70.44   69.71     71.43   71.85     73.85
                                                      n    182       427     297       143     146       161
a) I felt I knew what the examiners wanted            M    70.30     69.82   67.21     73.94 * 70.20     74.57
   this year                                          n    218       393     229       208     122       186
e) I was surprised by the questions on the exam       M    70.89     68.02   69.96     70.31   73.86     71.77
   this year                                          n    419       192     113       325     158       150
j) I can do well in this exam even if I do not        M    70.76     68.82   69.76     71.28   71.98     74.07
   fully understand the topics                        n    376       234     291       149     172       134
Narrowing of the curriculum scale
g) I left a lot of topics out of my revision and      M    71.52     67.73 * 72.56     64.89 * 73.91     71.78
   still think I will do well                         n    364       247     309       131     161       146
11f) I chose not to study some topics as I thought    M    71.72     67.96 * 73.45     64.52 * 72.99     72.66
   they would not come up                             n    329       285     297       147     162       145
Valuable learning scale
h) I think I will be able to use what I learned       M    68.82     72.12   65.49     72.06   73.39     72.56
   for this exam in the future                        n    391       219     122       316     127       180
d) To do well in this exam I need to think and        M    68.56     70.32   70.33     70.39   73.42     72.67
   adapt what I know                                  n    108       501     120       318     57        249
c) The exam tests the right kind of learning          M    69.64     70.66   68.03     72.79   73.98     71.42
                                                      n    399       212     238       201     171       137
f) To do well in this exam, I need a broad            M    69.29     70.31   66.46     70.75   72.45     72.96
   understanding of the subject, across many topics   n    190       420     48        391     49        257
b) To do well in this exam, remembering is more       M    71.13     68.78   71.53     69.68   71.27     73.76
   important than understanding                       n    319       291     190       248     110       197

Note: * indicates a statistically significant difference in mean scores between the ‘disagree’ and ‘agree’
combined groups at the 95% confidence level, using the Bonferroni correction for multiple comparisons.
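The significance flags in the table can be illustrated with a minimal sketch, assuming scipy is available: a two-sample t-test per item, with the alpha level divided by the number of comparisons made. The score samples below are invented, and the study's exact test specification may differ (e.g. in the variance assumption).

```python
from scipy import stats

def significant(disagree, agree, n_tests, alpha=0.05):
    """Welch two-sample t-test for one item, Bonferroni-corrected for the
    n_tests comparisons made across all items."""
    t, p = stats.ttest_ind(disagree, agree, equal_var=False)
    return bool(p < alpha / n_tests)

# Made-up score samples with a clear mean difference (71.5 vs 64.5).
disagree = [71.0] * 50 + [72.0] * 50
agree = [64.0] * 50 + [65.0] * 50
print(significant(disagree, agree, n_tests=10))  # → True
```

Dividing alpha by the number of tests keeps the family-wise error rate at the nominal level across the whole block of item comparisons.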

In English and biology, significant differences were found between students who agreed and disagreed with the item ‘I left a lot of topics out of my revision and still think I will do well’, which relates to the narrowing of the curriculum scale. The differences are not statistically significant for geography. This item is important because it tells us whether students believe it is possible to narrow their reading before the exam; the highest-performing students in English and biology evidently do not believe this is possible. Similarly, students who agreed with the statement ‘I chose not to study some topics as I thought they would not come up’ performed significantly worse in the English and biology exams. Narrowing the curriculum thus seems to engender unwanted approaches to learning in English and biology. It may be that the extent of question choice in geography means that narrowing the curriculum operated differently in that subject.

In biology and geography, students scored higher if they agreed with the statement that they felt they knew what the examiners wanted this year; the difference is statistically significant for biology, while almost no difference was found among students who sat the English exam.

In English and biology, students who agreed with the statement ‘To do well in this exam, remembering is more important than understanding’ scored lower than those who disagreed, whereas in geography they scored higher; none of these differences is statistically significant. Also, in general, students who agreed with the statement ‘To do well in this exam, I need a broad understanding of the subject, across many topics’ performed better, but again the differences were not statistically significant.

Table 16 presents average exam scores for the memorisation strategies items for two combined

categories, ‘now and then’ (ie ‘almost never’ and ‘now and then’) and ‘often’ (ie ‘often’ and ‘always’).

Table 16. Memorisation strategies and exam scores: average scores (M) and sample size (n)

                                                           English             Biology             Geography
                                                           Now and   Often     Now and   Often     Now and   Often
                                                           then                then                then
a) I tried to memorise all the material that I was    M    71.11     68.80     62.08     72.92 *   68.55     74.72 *
   taught                                             n    316       296       106       339       93        214
e) I tried to learn my notes by heart                 M    70.56     69.33     67.84     71.58     72.71     73.04
                                                      n    330       282       148       297       107       199
k) I tried to memorise as much of the revision        M    71.01     69.51     65.25     71.63     69.05     73.82
   material as possible                               n    237       376       79        365       63        245
m) I tried to memorise what I thought was important   M    70.10     70.07     64.22     70.98     70.43     73.04
                                                      n    98        514       32        412       23        285

Note: * indicates a statistically significant difference in mean scores between the ‘often’ and ‘now and then’
combined groups at the 95% confidence level, using the Bonferroni correction for multiple comparisons.

Students who declared they often memorised all the material they were taught scored significantly higher in biology and geography than the rest. In general, memorisation strategies seem to be more effective for performance in biology and geography, although the remaining differences are not statistically significant. In English, students applying memorisation strategies tend to perform worse, but again the differences are not statistically significant.

Table 17 reports correlation coefficients for the learning strategies scales and the exam scores in the

three subject areas.

Page 31: Working Paper 3: Student Questionnaire · Questionnaire (MSLQ) (Pintrich, Smith, Garica & McKeachie, 1991). The limited validity of this questionnaire for measuring learning strategies

31

Table 17. Correlations between learning strategy scores and exam scores

              Memorisation   Elaboration   Control
English          -.05           -.05         .11 *
Biology           .15 *          .13 *       .36 *
Geography         .18 *          .03         .24 *

* Correlation is significant at the 0.01 level (2-tailed)

The results are consistent with Table 16. The memorisation factor was positively related to the exam scores for biology and geography but not for English. Additionally, the results in Table 17 show that control strategies were even more important than memorisation strategies for obtaining higher scores in the exam, especially in biology. Elaboration strategies, in contrast, are positively related to exam scores only in biology.

From the literature we expected to find a lower correlation between memorisation strategies and language scores, and a higher correlation between control strategies and language scores (Donker et al, 2014). The differences found for biology and geography must also be understood in light of the tasks set in the Leaving Certificate, which ask students to recall and explain a number of issues from the curriculum. In theory, we also expected a stronger correlation with achievement for control strategies than for memorisation strategies. Our study confirms this, but it is also worth noting that for biology the correlation between achievement and control strategy use is above r = 0.3, which is considered strong in strategy research. In PISA, the correlation between control strategy use and achievement is often found to be around r = 0.2 (Marsh et al, 2006).
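Correlations of this kind, with a two-tailed significance test as in Table 17, can be computed as in the minimal sketch below, assuming scipy is available. The data are invented and deliberately show a much stronger association than the modest coefficients reported above.

```python
from scipy import stats

# Synthetic illustration of a scale-score correlation (values invented):
# a learning strategy scale value and an exam score for ten students.
strategy_scale = [0.4, 0.8, 0.9, 1.1, 1.2, 1.5, 1.9, 2.1, 2.5, 2.8]
exam_score = [55, 58, 60, 59, 62, 66, 71, 74, 78, 81]

r, p = stats.pearsonr(strategy_scale, exam_score)
print(r > 0.3, p < 0.01)  # strong and significant at the 0.01 level (2-tailed)
```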

Research question 7 – what kinds of examination preparation strategies do

students use?

This question is addressed with information reported by students on the learning strategies they used

and the kinds of support they received for preparing for the exam.

Learning strategies

We showed in Table 2 that learning strategies can be grouped quite consistently across subjects into three categories: memorisation, elaboration and control strategies. Table 18 presents student responses on the learning strategies they used for the exam. Students reported the frequency with which they applied different learning strategies on a Likert scale (ie (1) almost never, (2) now and then, (3) often and (4) always). The categories ‘often’ and ‘always’ have been combined into a single category, ‘often’. The percentage in the combined ‘often’ category and the total number of valid responses are reported.
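The collapsing of Likert categories can be sketched in a few lines. This is illustrative only; responses outside the 1–4 range are treated as invalid here.

```python
LIKERT = {1: "almost never", 2: "now and then", 3: "often", 4: "always"}

def pct_often(responses):
    """Percentage of valid responses in the combined 'often' band (3 or 4),
    plus the number of valid responses."""
    valid = [r for r in responses if r in LIKERT]
    return round(100 * sum(r >= 3 for r in valid) / len(valid)), len(valid)

print(pct_often([1, 2, 3, 4, 4, 3, 2, 9]))  # 9 is invalid -> (57, 7)
```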

Learning strategies varied by subject. For example, students tend to use memorisation strategies more

often in biology and geography than in English. More than 80% of students tried to memorise as much

as possible of the revision material in biology and geography, while about 63% did it for the English

exam. Similarly, more than 60% of students tried to learn the notes by heart in biology and geography,

while 48% tried it for the English exam.

Interestingly, even though students reported that it is not possible to predict what will come up in the biology exam, and the majority agree that it assesses the right kind of learning, 77% of students agreed with the statement ‘I tried to memorise all the material that I was taught’. The corresponding figures are 70% for geography and 49% for English. It can be argued that biology is a subject in which students need to memorise a lot of material, and tasks are designed to assess factual knowledge. The analysis of the biology exams in Ireland also revealed that students are asked to demonstrate factual knowledge, such as naming certain parts of a cell. In this respect, it is reasonable to use memorisation strategies.

Table 18. Learning strategies: percentage ‘often’ (%) and total valid responses (n)

                                                                         English       Biology       Geography
                                                                         %      n      %      n      %      n
Memorisation strategy
k) I tried to memorise as much of the revision material as possible      63%    762    83%    553    81%    396
e) I tried to learn my notes by heart                                    48%    763    68%    553    65%    393
a) I tried to memorise all the material that I was taught                49%    763    77%    554    70%    396
m) I tried to memorise what I thought was important                      85%    762    94%    553    92%    397
Elaboration strategy
g) I figured out how the information might be useful in the real world   21%    763    54%    553    41%    394
c) I tried to relate new information to knowledge from other subjects    30%    762    53%    551    56%    392
h) I tried to understand the revision material better by relating it
   to what I already knew                                                56%    764    70%    551    69%    396
n) I studied material that went beyond what is expected for the exam     18%    762    26%    552    17%    396
Control strategy
i) I made sure that I remembered the most important points in the
   revision material                                                     91%    764    92%    554    91%    395
d) I checked if I understood what I had read                             80%    763    87%    552    84%    394
j) If I did not understand something, I looked for additional
   information to clarify it                                             62%    764    75%    553    67%    396
l) I tried to figure out which ideas I had not really understood         51%    763    72%    551    59%    396
b) I started by figuring out exactly what I needed to learn              79%    764    80%    549    83%    395

Similarly, students seemed to use elaboration strategies more often for the biology and geography exams than for the English exam. For example, about 70% of students tried to understand the revision material better by relating it to what they already knew in biology and geography, while 56% did so for English. Also, in biology and geography more than 50% of students tried to relate new information to knowledge from other subjects, while 30% did so for English. Differences between subjects in the use of control strategies are less pronounced.


Learning support

Students received different kinds of support for preparing for the exam. Table 19 reports the

percentage of students who received support, by kind of support activity and the total number of valid

responses.

Table 19. Support for learning: percentages with support (%) and total valid responses (n)

                                                           English       Biology       Geography
                                                           %      n      %      n      %      n
a) Which topics were likely to come up was explained
   to me                                                   75%    748    67%    544    75%    384
b) Marking criteria were explained to me                   81%    747    77%    542    87%    386
c) I used material from grinds websites                    34%    750    27%    543    27%    387
d) Model answers were given to me                          82%    747    31%    543    87%    386
e) I was given past papers                                 94%    748    95%    543    94%    385
f) I have textbooks to help with my study                  82%    748    93%    542    92%    385
g) The exam format was explained to me                     92%    747    86%    542    91%    385
h) I used revision guides                                  54%    747    64%    541    62%    386
i) I looked at past papers on the internet                 66%    748    74%    542    65%    385
j) My parents helped me with my studies                    23%    751    16%    545    17%    386
k) Friends helped me to prepare for the exams              39%    749    50%    543    44%    387
l) I used revision apps                                    13%    752    15%    544    12%    387
m) I took one-to-one or small-group grinds                 14%    752    13%    545    11%    389
n) I attended a grinds school                              8%     752    8%     545    5%     388

As can be seen from the table, and as was discussed earlier, students were less likely to attend grinds schools, take one-to-one or small-group grinds, or use revision apps than to draw on other kinds of support. In contrast, the large majority of students were given past papers and report that the exam format was explained to them. In general, there are no substantial differences between subjects. However, an important difference is found for item d: only 31% of students report that model answers were given to them for biology, while more than 80% do so for English and geography.

When it comes to support from family, only 23% report this in English, 16% in biology and 17% in geography. Support and help from friends is more common. Here again we find subject differences: half of the biology students report having had help from friends, with lower percentages in the other two subjects.

Regression analysis

Regression analysis has been conducted for the examination scores and the predictability scales. Regression results provide evidence of associations, but they cannot be interpreted in terms of causation.


Examination scores model

Regressions of examination scores on family SES, gender, the learning strategies scales, the learning

support scales and the predictability scales are estimated stepwise for English (see Table 20), biology

(see Table 21) and geography (see Table 22).

The results indicate a positive association between the examination scores and family SES. The control

strategies scale is positively related to the exam scores in all three subjects even after controlling for

family SES. In biology and geography the memorisation strategy is also positively related to the exam

scores irrespective of family SES.
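A minimal sketch of the stepwise idea, assuming numpy and using invented data: each step adds a block of predictors and re-estimates the unstandardised coefficients. The study's models were estimated with standard errors and significance tests not reproduced here.

```python
import numpy as np

def ols(y, *predictors):
    """Unstandardised OLS coefficients with an intercept, via least squares."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, b1, b2, ...]

# Synthetic data in the spirit of the stepwise models (coefficients invented).
rng = np.random.default_rng(1)
ses = rng.normal(0, 1, 400)
control = rng.normal(0, 1, 400)
score = 63 + 6 * ses + 3 * control + rng.normal(0, 8, 400)

model1 = ols(score, ses)            # step 1: background only
model2 = ols(score, ses, control)   # step 2: add a strategy scale
print(model1.shape, model2.shape)   # → (2,) (3,)
```

Comparing the SES coefficient across steps, as the tables below do, shows how much of the background association is accounted for by the added scales.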


Table 20. Regression model of English scores (unstandardised coefficients and standard errors)

                                    Model 1:           Model 2:             Model 3:           Model 4: Views on
                                    Background         Learning strategies  Learning support   predictability
                                    (n=579)            (n=563)              (n=408)            (n=400)
Intercept                           62.83 (1.42) ***   63.84 (1.44) ***     64.52 (2.18) ***   65.37 (2.24) ***
Family SES                          6.16 (1.03) ***    5.44 (1.04) ***      5.96 (1.24) ***    5.59 (1.27) ***
Female                              1.98 (1.28)        1.73 (1.30)          2.56 (1.57)        2.53 (1.63)
Memorisation strategy scale                            -0.74 (0.71)         -1.07 (0.86)       -1.17 (0.86)
Elaboration strategy scale                             -1.49 (0.79) .       -1.23 (0.96)       -1.69 (1.01) .
Control strategy scale                                 1.82 (0.81) *        2.14 (0.95) *      1.43 (1.00)
School learning support scale                                               -0.75 (0.68)       -0.88 (0.70)
External learning support scale                                             -0.52 (0.57)       -0.38 (0.57)
Predictability scale                                                                           1.40 (1.01)
Valuable learning scale                                                                        1.55 (1.04)
Narrowing of the curriculum scale                                                              -2.08 (0.77) **

Significance codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1


Table 21. Regression model of biology scores (unstandardised coefficients and standard errors)

                                    Model 1:           Model 2:             Model 3:           Model 4: Views on
                                    Background         Learning strategies  Learning support   predictability
                                    (n=423)            (n=412)              (n=247)            (n=238)
Intercept                           59.15 (2.56) ***   62.40 (2.49) ***     60.38 (3.85) ***   58.88 (3.93) ***
Family SES                          9.68 (1.74) ***    7.67 (1.66) ***      8.65 (2.24) ***    8.99 (2.24) ***
Female                              2.90 (2.25)        1.77 (2.19)          2.81 (2.96)        3.88 (3.09)
Memorisation strategy scale                            2.41 (1.14) *        1.83 (1.45)        1.15 (1.43)
Elaboration strategy scale                             1.97 (1.27)          1.91 (1.64)        -0.51 (1.87)
Control strategy scale                                 7.74 (1.31) ***      9.07 (1.73) ***    7.38 (1.75) ***
School learning support scale                                               0.26 (0.79)        0.38 (0.80)
External learning support scale                                             -1.91 (0.90) *     -2.49 (0.88) **
Predictability scale                                                                           4.21 (1.72) *
Valuable learning scale                                                                        2.88 (1.89)
Narrowing the curriculum scale                                                                 -4.29 (1.74) *

Significance codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1


Table 22. Regression model of geography scores (unstandardised coefficients and standard errors)

                                    Model 1:           Model 2:             Model 3:           Model 4: Views on
                                    Background         Learning strategies  Learning support   predictability
                                    (n=294)            (n=283)              (n=198)            (n=196)
Intercept                           64.45 (1.74) ***   66.45 (1.59) ***     73.46 (2.21) ***   73.48 (2.20) ***
Family SES                          7.85 (1.45) ***    6.03 (1.32) ***      3.72 (1.28) **     2.97 (1.28) *
Female                              3.42 (1.64) *      3.17 (1.50) *        1.62 (1.51)        2.23 (1.50)
Memorisation strategy scale                            2.20 (0.92) *        2.53 (0.96) **     2.03 (0.96) *
Elaboration strategy scale                             0.12 (0.99)          -0.02 (1.01)       0.68 (1.07)
Control strategy scale                                 3.29 (1.02) **       2.73 (1.04) **     2.97 (1.02) **
School learning support scale                                               -0.23 (0.72)       0.05 (0.71)
External learning support scale                                             -0.22 (0.47)       0.05 (0.47)
Predictability scale                                                                           2.16 (1.04) *
Valuable learning scale                                                                        -2.81 (1.09) *
Narrowing the curriculum scale                                                                 -1.32 (0.78) .

Significance codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1


The narrowing the curriculum scale is negatively related to the exam scores, consistently across the three subjects. There is also evidence that the predictability scale is positively related to the exam scores in biology and geography. It is important to note that not only can views on predictability affect performance in the exam, but exam results can also influence views on predictability. One should therefore be careful in interpreting the direction of causation in these results.

Predictability model

We now look at associations with students’ views on the three predictability scales. Regressions of the

views on the exam scales on family SES, gender and the examination scores are estimated for the

English (see Table 23), biology (see Table 24), and geography (see Table 25) samples. For ease of

interpretation coefficients of examination scores were multiplied by 100.
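The rescaling is purely cosmetic: dividing a predictor by 100 multiplies its coefficient by 100 and changes nothing else about the fit, as this small numpy check illustrates (the numbers are invented).

```python
import numpy as np

def slope(x, y):
    """Slope from a simple OLS fit of y on x with an intercept."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Invented exam scores (0-100 scale) and a standardised attitude scale.
exam = np.array([55.0, 60, 65, 70, 75, 80, 85])
scale = np.array([0.1, 0.0, 0.2, 0.3, 0.2, 0.4, 0.5])

print(round(slope(exam / 100, scale) / slope(exam, scale)))  # → 100
```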

The results for the predictable scale indicate no significant association with family SES, and in all three subjects girls were less likely to share the view that the exam is predictable. There is a slight positive association between the examination scores and the predictable scale for biology and geography. Family SES is positively related to the valuable learning scale for English and negatively for geography. No association with gender is found for the valuable learning scale. The narrowing the curriculum scale is not significantly associated with family SES, but an association with gender is apparent in all three subjects: girls are less likely to report narrowing the curriculum strategies. Also, the examination scores are negatively associated with the narrowing the curriculum scale even after controlling for family SES. That is, students who score higher in the exam tend to use narrowing the curriculum strategies less often, independently of their family SES.


Table 23. English: views on predictability regression models (unstandardised coefficients and standard errors)

                      Predictable scale                   Valuable learning scale             Narrowing the curriculum scale
                      Model 1 (n=697)  Model 2 (n=564)    Model 1 (n=697)  Model 2 (n=564)    Model 1 (n=697)  Model 2 (n=564)
Intercept             0.10 (0.07)      0.05 (0.15)        -0.16 (0.07) *   -0.48 (0.16) **    0.29 (0.09) **   0.76 (0.20) ***
Family SES            0.10 (0.05) *    0.14 (0.05) **     0.12 (0.05) *    0.07 (0.06)        -0.08 (0.06)     -0.05 (0.07)
Female                -0.30 (0.06) *** -0.36 (0.07) ***   0.03 (0.06)      0.04 (0.07)        -0.35 (0.08) *** -0.36 (0.09) ***
English exam score                     0.08 (0.21)                         0.51 (0.23) *                       -0.66 (0.28) *

Significance codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Table 24. Biology: views on predictability regression models (unstandardised coefficients and standard errors)

                      Predictable scale                   Valuable learning scale             Narrowing the curriculum scale
                      Model 1 (n=502)  Model 2 (n=475)    Model 1 (n=502)  Model 2 (n=475)    Model 1 (n=502)  Model 2 (n=475)
Intercept             0.25 (0.09) **   0.00 (0.14)        0.04 (0.09)      -0.10 (0.11)       0.29 (0.09) **   0.68 (0.12) ***
Family SES            0.00 (0.06)      -0.04 (0.06)       0.00 (0.06)      -0.03 (0.06)       -0.11 (0.06) .   0.02 (0.06)
Female                -0.36 (0.08) *** -0.41 (0.08) ***   -0.07 (0.08)     -0.08 (0.08)       -0.26 (0.08) **  -0.29 (0.08) ***
Biology exam score                     0.47 (0.18) **                      0.46 (0.18)                         -0.80 (0.18) ***

Significance codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1


Table 25. Geography: views on predictability regression models (unstandardised coefficients and standard errors)

                      Predictable scale                   Valuable learning scale             Narrowing the curriculum scale
                      Model 1 (n=366)  Model 2 (n=287)    Model 1 (n=366)  Model 2 (n=287)    Model 1 (n=366)  Model 2 (n=287)
Intercept             0.10 (0.09)      -0.40 (0.23)       0.04 (0.09)      0.32 (0.24)        0.22 (0.11) *    0.60 (0.31)
Family SES            0.03 (0.07)      -0.02 (0.08)       -0.15 (0.07) *   -0.14 (0.09)       -0.09 (0.09)     -0.10 (0.11)
Female                -0.22 (0.08) **  -0.25 (0.09) **    0.14 (0.08) .    0.16 (0.09) .      -0.25 (0.11) *   -0.29 (0.12) **
Geography exam score                   0.82 (0.33) *                       -0.38 (0.35)                        -0.39 (0.44)

Significance codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1


References

Allan, A (1997) Begging the questionnaire: instrument effect on readers’ response to a self-report

checklist. Language Testing, 12, 133–156

Cattell, R B (1966) The scree test for the number of factors. Multivariate Behavioral Research, 1, 245–

276

Courtney, M G (2013) Determining the number of factors to retain in EFA: using the SPSS R-Menu v2.0

to make more judicious estimations. Practical Assessment, Research & Evaluation, 18(8), 1–14

Donker, A S, Boer, H de, Kostons, D, Dignath-van Ewijk, C C, & Werf, M P C van der (2014) Effectiveness of learning strategy instruction on academic performance: a meta-analysis. Educational Research Review, 11, 1–26

Fabrigar, L R, Wegener, D T, MacCallum, R C, & Strahan, E J (1999) Evaluating the use of exploratory factor analysis in psychological research. Psychological Methods, 4(3), 272–299

Horn, J L (1965) A rationale and test for the number of factors in factor analysis. Psychometrika, 30,

179–185

Kaiser, H F (1960) The application of electronic computers to factor analysis. Educational &

Psychological Measurement, 20, 141–151

Karabenick, S A, Woolley, M E, Friedel, J M, Bridget, V, Blazevski, J, Bonney, C R, Groot, E D E (2007)

Cognitive processing of self-report items in educational research: Do they think what we mean?

Educational Psychologist, 42(3), 37–41

Marsh, H W, Hau, K-T, Artelt, C, & Baumert, J (2006) OECD's brief self-report measure of educational psychology's most useful affective constructs: cross-cultural, psychometric comparisons across 25 countries. International Journal of Testing, 6(4), 311–360

Masters, G N & Wright, B D (1997) The partial credit model. In W J van der Linden & R K Hambleton

(Eds), Handbook of Modern Item Response Theory. New York: Springer, 101–122

Pintrich, P R, Smith, D A F, Garcia, T, & McKeachie, W J (1991) A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor: University of Michigan, National Center for Research to Improve Postsecondary Teaching and Learning

Raiche, G (2010) nFactors: an R package for parallel analysis and non graphical solutions to the Cattell

scree test. R package version 2.3.3

Raiche, G, Riopel, M, & Blais, J G (2006) Non Graphical Solutions for the Cattell's Scree Test. Paper presented at The International Annual Meeting of the Psychometric Society, Montreal. Retrieved December 10, 2013, from http://www.er.uqam.ca/nobel/r17165/RECHERCHE/COMMUNICATIONS/2006/IMPS/IMPS_PRESENTATION_2006.pdf

Rasch, G (1960) Probabilistic Models for Some Intelligence and Attainment Tests. Copenhagen,

Denmark: Nielsen and Lydiche


Ruscio, J, & Roche, B (2012) Determining the number of factors to retain in an exploratory factor

analysis using comparison data of a known factorial structure. Psychological Assessment, 24(2),

282–292

Samuelstuen, M & Braten, I (2007) Examining the validity of self-reports on scales measuring students’

strategic processing. British Journal of Educational Psychology, 77, 351–378

Velicer, W F, Eaton, C A, & Fava, J L (2000) Construct explication through factor or component analysis:

a review and evaluation of alternative procedures for determining the number of factors or

components. In R D Goffin & E Helmes (Eds). Problems and Solutions in Human Assessment:

honoring Douglas N. Jackson at seventy. Boston, MA: Kluwer Academic, 41–71

Weinstein, C E, Husman, J, & Dierking, D R (2000) Self-Regulation Interventions with a Focus on

Learning Strategies. In M Boekaerts, P R Pintrich, & M Zeidner (Eds), Handbook of Self-

Regulation. Burlington: Elsevier Academic Press, 727–747

Weinstein, C E & Meyer, D K (1991) Cognitive learning strategies and college teaching. New Directions

for Teaching and Learning, 45, 15–26

Weinstein, C E & Palmer, D R (1990) LASSI-HS User’s Manual. Clearwater, FL: H&H Publishing

Weinstein, C E, Zimmermann, S A, & Palmer, D R (1988) Assessing learning strategies: the design and

development of the LASSI. In C E Weinstein, E T Goetz, & P A Alexander (Eds), Learning and

Study Strategies: issues in assessment, instruction and evaluation. San Diego, CA: Academic

Press, 25–39


Appendix A: Questionnaire


Appendix B: Summary Tables

Views on predictability

Overall results

Table A1. What were the views of students on the English exam? (percentages)

Responses: Strongly disagree / Disagree / Agree / Strongly agree (n)

a) I felt I knew what the examiners wanted this year: 7% / 29% / 52% / 11% (n = 760)
b) To do well in this exam, remembering is more important than understanding: 19% / 33% / 22% / 26% (n = 760)
c) The exam tests the right kind of learning: 33% / 32% / 28% / 6% (n = 760)
d) To do well in this exam I need to think and adapt what I know: 4% / 13% / 47% / 36% (n = 759)
e) I was surprised by the questions on the exam this year: 18% / 50% / 21% / 11% (n = 761)
f) To do well in this exam, I need a broad understanding of the subject, across many topics: 7% / 23% / 45% / 24% (n = 760)
g) I left a lot of topics out of my revision and still think I will do well: 20% / 42% / 31% / 7% (n = 762)
h) I think I will be able to use what I learned for this exam in the future: 38% / 26% / 26% / 10% (n = 761)
i) I predicted the exam questions well: 7% / 24% / 47% / 23% (n = 760)
j) I can do well in this exam even if I do not fully understand the topics: 21% / 41% / 30% / 7% (n = 760)

Table A2. What were the views of students on the biology exam? (percentages)

Responses: Strongly disagree / Disagree / Agree / Strongly agree (n)

a) I felt I knew what the examiners wanted this year: 16% / 36% / 39% / 9% (n = 544)
b) To do well in this exam, remembering is more important than understanding: 16% / 27% / 24% / 32% (n = 546)
c) The exam tests the right kind of learning: 29% / 25% / 35% / 11% (n = 546)
d) To do well in this exam I need to think and adapt what I know: 6% / 21% / 44% / 30% (n = 546)
e) I was surprised by the questions on the exam this year: 6% / 20% / 35% / 39% (n = 546)
f) To do well in this exam, I need a broad understanding of the subject, across many topics: 2% / 8% / 36% / 54% (n = 548)
g) I left a lot of topics out of my revision and still think I will do well: 31% / 39% / 23% / 6% (n = 549)
h) I think I will be able to use what I learned for this exam in the future: 12% / 14% / 40% / 33% (n = 547)
i) I predicted the exam questions well: 25% / 44% / 23% / 9% (n = 549)
j) I can do well in this exam even if I do not fully understand the topics: 31% / 36% / 26% / 7% (n = 548)


Table A3. What were the views of students on the geography exam? (percentages)

Responses: Strongly disagree / Disagree / Agree / Strongly agree (n)

a) I felt I knew what the examiners wanted this year: 6% / 34% / 44% / 16% (n = 395)
b) To do well in this exam, remembering is more important than understanding: 12% / 25% / 28% / 35% (n = 395)
c) The exam tests the right kind of learning: 30% / 27% / 32% / 11% (n = 396)
d) To do well in this exam I need to think and adapt what I know: 5% / 13% / 49% / 32% (n = 394)
e) I was surprised by the questions on the exam this year: 9% / 41% / 31% / 18% (n = 396)
f) To do well in this exam, I need a broad understanding of the subject, across many topics: 4% / 11% / 40% / 46% (n = 394)
g) I left a lot of topics out of my revision and still think I will do well: 19% / 35% / 35% / 11% (n = 395)
h) I think I will be able to use what I learned for this exam in the future: 18% / 25% / 45% / 12% (n = 395)
i) I predicted the exam questions well: 14% / 36% / 35% / 15% (n = 395)
j) I can do well in this exam even if I do not fully understand the topics: 21% / 37% / 32% / 11% (n = 394)


Views on the exam by gender

Table A4. Views on the English exam by gender (percentages)

Responses: Strongly disagree / Disagree / Agree / Strongly agree (n)

a) I felt I knew what the examiners wanted this year
   Females: 8% / 33% / 49% / 10% (n = 498); Males: 5% / 22% / 58% / 15% (n = 261)
b) To do well in this exam, remembering is more important than understanding
   Females: 19% / 33% / 23% / 25% (n = 499); Males: 19% / 33% / 21% / 27% (n = 260)
c) The exam tests the right kind of learning
   Females: 35% / 32% / 28% / 5% (n = 498); Males: 30% / 33% / 30% / 7% (n = 261)
d) To do well in this exam I need to think and adapt what I know
   Females: 4% / 12% / 45% / 39% (n = 497); Males: 5% / 14% / 50% / 31% (n = 261)
e) I was surprised by the questions on the exam this year
   Females: 16% / 48% / 23% / 12% (n = 499); Males: 21% / 52% / 18% / 9% (n = 261)
f) To do well in this exam, I need a broad understanding of the subject, across many topics
   Females: 7% / 22% / 45% / 26% (n = 499); Males: 8% / 23% / 46% / 22% (n = 260)
g) I left a lot of topics out of my revision and still think I will do well
   Females: 23% / 44% / 28% / 4% (n = 500); Males: 15% / 36% / 38% / 11% (n = 261)
h) I think I will be able to use what I learned for this exam in the future
   Females: 40% / 26% / 25% / 9% (n = 499); Males: 34% / 25% / 28% / 13% (n = 261)
i) I predicted the exam questions well
   Females: 8% / 27% / 47% / 18% (n = 498); Males: 5% / 18% / 48% / 30% (n = 261)
j) I can do well in this exam even if I do not fully understand the topics
   Females: 23% / 43% / 28% / 6% (n = 498); Males: 17% / 38% / 34% / 10% (n = 261)


Table A5. Views on the biology exam by gender (percentages)

Responses: Strongly disagree / Disagree / Agree / Strongly agree (n)

a) I felt I knew what the examiners wanted this year
   Females: 18% / 38% / 37% / 8% (n = 381); Males: 11% / 34% / 43% / 12% (n = 163)
b) To do well in this exam, remembering is more important than understanding
   Females: 16% / 26% / 24% / 33% (n = 382); Males: 16% / 30% / 23% / 30% (n = 164)
c) The exam tests the right kind of learning
   Females: 33% / 27% / 32% / 9% (n = 381); Males: 21% / 21% / 41% / 17% (n = 165)
d) To do well in this exam I need to think and adapt what I know
   Females: 4% / 19% / 45% / 32% (n = 382); Males: 9% / 24% / 41% / 26% (n = 164)
e) I was surprised by the questions on the exam this year
   Females: 5% / 18% / 33% / 44% (n = 382); Males: 10% / 23% / 41% / 27% (n = 164)
f) To do well in this exam, I need a broad understanding of the subject, across many topics
   Females: 2% / 6% / 36% / 56% (n = 383); Males: 4% / 12% / 36% / 48% (n = 165)
g) I left a lot of topics out of my revision and still think I will do well
   Females: 33% / 43% / 20% / 5% (n = 384); Males: 26% / 32% / 33% / 10% (n = 165)
h) I think I will be able to use what I learned for this exam in the future
   Females: 13% / 16% / 38% / 33% (n = 382); Males: 11% / 12% / 44% / 34% (n = 165)
i) I predicted the exam questions well
   Females: 27% / 46% / 19% / 8% (n = 384); Males: 19% / 40% / 32% / 10% (n = 165)
j) I can do well in this exam even if I do not fully understand the topics
   Females: 32% / 39% / 24% / 6% (n = 383); Males: 30% / 30% / 30% / 10% (n = 165)


Table A6. Views on the geography exam by gender (percentages)

Responses: Strongly disagree / Disagree / Agree / Strongly agree (n)

a) I felt I knew what the examiners wanted this year
   Females: 8% / 38% / 38% / 15% (n = 225); Males: 4% / 28% / 51% / 18% (n = 170)
b) To do well in this exam, remembering is more important than understanding
   Females: 12% / 27% / 27% / 34% (n = 226); Males: 12% / 21% / 30% / 37% (n = 169)
c) The exam tests the right kind of learning
   Females: 30% / 30% / 32% / 8% (n = 226); Males: 30% / 24% / 32% / 14% (n = 170)
d) To do well in this exam I need to think and adapt what I know
   Females: 5% / 11% / 45% / 39% (n = 224); Males: 6% / 15% / 55% / 24% (n = 170)
e) I was surprised by the questions on the exam this year
   Females: 8% / 39% / 31% / 23% (n = 226); Males: 11% / 44% / 32% / 12% (n = 170)
f) To do well in this exam, I need a broad understanding of the subject, across many topics
   Females: 3% / 8% / 38% / 51% (n = 225); Males: 5% / 14% / 42% / 39% (n = 169)
g) I left a lot of topics out of my revision and still think I will do well
   Females: 20% / 41% / 32% / 7% (n = 226); Males: 18% / 28% / 38% / 16% (n = 169)
h) I think I will be able to use what I learned for this exam in the future
   Females: 19% / 25% / 43% / 13% (n = 226); Males: 17% / 25% / 46% / 12% (n = 169)
i) I predicted the exam questions well
   Females: 16% / 40% / 30% / 15% (n = 226); Males: 12% / 31% / 41% / 16% (n = 169)
j) I can do well in this exam even if I do not fully understand the topics
   Females: 25% / 37% / 28% / 10% (n = 226); Males: 15% / 36% / 37% / 12% (n = 168)


Views on the exam and family SES

Table A7. Views on the exam and family SES (correlation coefficients (Rho))

Rho (n) per subject: English / Biology / Geography

a) I felt I knew what the examiners wanted this year: 0.02 (706) / 0.04 (508) / 0.03 (371)
b) To do well in this exam, remembering is more important than understanding: -0.08 (706) / 0.05 (510) / 0.11 (372)
c) The exam tests the right kind of learning: -0.02 (707) / -0.03 (511) / -0.08 (372)
d) To do well in this exam I need to think and adapt what I know: 0.03 (705) / -0.04 (510) / -0.13 (370)
e) I was surprised by the questions on the exam this year: -0.07 (707) / 0.04 (511) / 0.07 (372)
f) To do well in this exam, I need a broad understanding of the subject, across many topics: 0.04 (706) / 0.04 (512) / -0.04 (372)
g) I left a lot of topics out of my revision and still think I will do well: -0.04 (707) / -0.06 (512) / -0.06 (372)
h) I think I will be able to use what I learned for this exam in the future: 0.14 (706) / 0.11 (510) / 0.07 (372)
i) I predicted the exam questions well: 0.08 (705) / 0.00 (512) / -0.02 (372)
j) I can do well in this exam even if I do not fully understand the topics: -0.06 (706) / 0.00 (512) / 0.09 (371)
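The Rho values above are rank correlations between a four-point ordinal item and the family-SES index. As a minimal sketch of how such a coefficient is obtained, the following computes a tie-aware Spearman's rho as the Pearson correlation of rank-transformed data; the Likert and SES values shown are hypothetical illustrative data, not the survey microdata.

```python
def ranks(values):
    """Average ranks (1-based); tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean 1-based rank for positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the ranks (handles ties)."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical data: item responses (1 = strongly disagree .. 4 = strongly
# agree) paired with a standardised family-SES index.
likert = [1, 2, 2, 3, 4, 3, 2, 4]
ses = [-1.2, -0.5, 0.1, 0.4, 1.3, 0.2, -0.3, 0.9]
print(round(spearman_rho(likert, ses), 2))  # prints 0.96
```

In practice the report's coefficients would come from a statistics package (e.g. `scipy.stats.spearmanr` gives the same tie-corrected value); the sketch only shows what the reported Rho measures.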

Views on the exam and exam results

Table A8. English: views on the exam and exam results (mean scores (M) and sample size (n))

M (n) by response: Strongly disagree / Disagree / Agree / Strongly agree

a) I felt I knew what the examiners wanted this year: 69.88 (43) / 70.40 (175) / 69.63 (326) / 70.75 (67)
b) To do well in this exam, remembering is more important than understanding: 73.01 (118) / 70.02 (201) / 68.81 (135) / 68.75 (156)
c) The exam tests the right kind of learning: 68.54 (202) / 70.76 (197) / 70.00 (174) / 73.68 (38)
d) To do well in this exam I need to think and adapt what I know: 73.13 (24) / 67.26 (84) / 69.00 (284) / 72.05 (217)
e) I was surprised by the questions on the exam this year: 70.45 (112) / 71.06 (307) / 68.69 (126) / 66.74 (66)
f) To do well in this exam, I need a broad understanding of the subject, across many topics: 67.50 (46) / 69.86 (144) / 70.68 (271) / 69.63 (149)
g) I left a lot of topics out of my revision and still think I will do well: 72.64 (121) / 70.97 (243) / 67.24 (199) / 69.79 (48)
h) I think I will be able to use what I learned for this exam in the future: 68.71 (232) / 68.99 (159) / 71.73 (156) / 73.10 (63)
i) I predicted the exam questions well: 66.84 (38) / 69.65 (144) / 70.84 (285) / 69.65 (142)
j) I can do well in this exam even if I do not fully understand the topics: 72.54 (126) / 69.86 (250) / 69.43 (184) / 66.60 (50)


Table A9. Biology: views on the exam and exam results (mean scores (M) and sample size (n))

M (n) by response: Strongly disagree / Disagree / Agree / Strongly agree

a) I felt I knew what the examiners wanted this year: 60.51 (68) / 70.03 (161) / 73.13 (166) / 77.14 (42)
b) To do well in this exam, remembering is more important than understanding: 72.69 (65) / 70.92 (125) / 72.09 (103) / 67.97 (145)
c) The exam tests the right kind of learning: 68.44 (125) / 67.57 (113) / 73.12 (157) / 71.59 (44)
d) To do well in this exam I need to think and adapt what I know: 71.88 (24) / 69.95 (96) / 69.85 (194) / 71.25 (124)
e) I was surprised by the questions on the exam this year: 67.68 (28) / 70.71 (85) / 71.21 (161) / 69.42 (164)
f) To do well in this exam, I need a broad understanding of the subject, across many topics: 66.00 (10) / 66.58 (38) / 69.11 (158) / 71.87 (233)
g) I left a lot of topics out of my revision and still think I will do well: 74.62 (130) / 71.06 (179) / 64.90 (102) / 64.83 (29)
h) I think I will be able to use what I learned for this exam in the future: 60.36 (55) / 69.70 (67) / 69.66 (177) / 75.11 (139)
i) I predicted the exam questions well: 69.17 (108) / 70.03 (189) / 71.48 (105) / 71.32 (38)
j) I can do well in this exam even if I do not fully understand the topics: 70.19 (132) / 69.40 (159) / 72.03 (118) / 68.39 (31)

Table A10. Geography: views on the exam and exam results (mean scores (M) and sample size (n))

M (n) by response: Strongly disagree / Disagree / Agree / Strongly agree

a) I felt I knew what the examiners wanted this year: 67.78 (18) / 70.63 (104) / 72.93 (133) / 78.68 (53)
b) To do well in this exam, remembering is more important than understanding: 72.08 (36) / 70.88 (74) / 73.10 (87) / 74.27 (110)
c) The exam tests the right kind of learning: 74.43 (87) / 73.51 (84) / 73.03 (99) / 67.24 (38)
d) To do well in this exam I need to think and adapt what I know: 78.67 (15) / 71.55 (42) / 72.68 (157) / 72.66 (92)
e) I was surprised by the questions on the exam this year: 77.32 (28) / 73.12 (130) / 71.25 (92) / 72.59 (58)
f) To do well in this exam, I need a broad understanding of the subject, across many topics: 72.08 (12) / 72.57 (37) / 72.54 (114) / 73.29 (143)
g) I left a lot of topics out of my revision and still think I will do well: 74.70 (50) / 73.56 (111) / 71.02 (108) / 73.95 (38)
h) I think I will be able to use what I learned for this exam in the future: 71.25 (52) / 74.87 (75) / 73.49 (139) / 69.39 (41)
i) I predicted the exam questions well: 73.03 (38) / 71.44 (108) / 73.15 (111) / 75.40 (50)
j) I can do well in this exam even if I do not fully understand the topics: 70.61 (57) / 72.65 (115) / 74.35 (100) / 73.24 (34)


Learning strategies

Overall results

Table A11. How did students prepare for the English exam? (percentages)

Responses: Almost never / Now and then / Often / Always (n)

a) I tried to memorise all the material that I was taught: 15% / 36% / 33% / 16% (n = 763)
b) I started by figuring out exactly what I needed to learn: 5% / 16% / 36% / 43% (n = 764)
c) I tried to relate new information to knowledge from other subjects: 41% / 28% / 15% / 15% (n = 762)
d) I checked if I understood what I had read: 5% / 15% / 34% / 46% (n = 763)
e) I tried to learn my notes by heart: 24% / 29% / 25% / 22% (n = 763)
f) I chose not to study some topics as I thought they would not come up: 27% / 27% / 26% / 19% (n = 765)
g) I figured out how the information might be useful in the real world: 55% / 24% / 13% / 8% (n = 763)
h) I tried to understand the revision material better by relating it to what I already knew: 18% / 26% / 35% / 21% (n = 764)
i) I made sure that I remembered the most important points in the revision material: 2% / 8% / 36% / 54% (n = 764)
j) If I did not understand something, I looked for additional information to clarify it: 11% / 27% / 31% / 32% (n = 764)
k) I tried to memorise as much of the revision material as possible: 10% / 26% / 35% / 28% (n = 762)
l) I tried to figure out which ideas I had not really understood: 15% / 34% / 34% / 17% (n = 763)
m) I tried to memorise what I thought was important: 3% / 12% / 35% / 49% (n = 762)
n) I studied material that went beyond what is expected for the exam: 56% / 26% / 10% / 9% (n = 762)


Table A12. How did students prepare for the biology exam? (percentages)

Responses: Almost never / Now and then / Often / Always (n)

a) I tried to memorise all the material that I was taught: 4% / 19% / 34% / 43% (n = 554)
b) I started by figuring out exactly what I needed to learn: 4% / 16% / 29% / 50% (n = 549)
c) I tried to relate new information to knowledge from other subjects: 21% / 26% / 28% / 24% (n = 551)
d) I checked if I understood what I had read: 3% / 10% / 30% / 57% (n = 552)
e) I tried to learn my notes by heart: 13% / 19% / 29% / 39% (n = 553)
f) I chose not to study some topics as I thought they would not come up: 41% / 28% / 17% / 14% (n = 553)
g) I figured out how the information might be useful in the real world: 26% / 21% / 30% / 23% (n = 553)
h) I tried to understand the revision material better by relating it to what I already knew: 9% / 21% / 38% / 31% (n = 551)
i) I made sure that I remembered the most important points in the revision material: 1% / 7% / 25% / 67% (n = 554)
j) If I did not understand something, I looked for additional information to clarify it: 7% / 18% / 28% / 47% (n = 553)
k) I tried to memorise as much of the revision material as possible: 5% / 12% / 33% / 50% (n = 553)
l) I tried to figure out which ideas I had not really understood: 8% / 20% / 39% / 33% (n = 551)
m) I tried to memorise what I thought was important: 1% / 5% / 29% / 64% (n = 553)
n) I studied material that went beyond what is expected for the exam: 51% / 22% / 14% / 12% (n = 552)

Table A13. How did students prepare for the geography exam? (percentages)

Responses: Almost never / Now and then / Often / Always (n)

a) I tried to memorise all the material that I was taught: 8% / 22% / 33% / 37% (n = 396)
b) I started by figuring out exactly what I needed to learn: 5% / 13% / 33% / 50% (n = 395)
c) I tried to relate new information to knowledge from other subjects: 19% / 25% / 28% / 28% (n = 392)
d) I checked if I understood what I had read: 5% / 11% / 38% / 46% (n = 394)
e) I tried to learn my notes by heart: 11% / 23% / 30% / 35% (n = 393)
f) I chose not to study some topics as I thought they would not come up: 23% / 31% / 23% / 24% (n = 395)
g) I figured out how the information might be useful in the real world: 32% / 27% / 26% / 15% (n = 394)
h) I tried to understand the revision material better by relating it to what I already knew: 10% / 21% / 39% / 30% (n = 396)
i) I made sure that I remembered the most important points in the revision material: 1% / 8% / 29% / 62% (n = 395)
j) If I did not understand something, I looked for additional information to clarify it: 10% / 24% / 34% / 33% (n = 396)
k) I tried to memorise as much of the revision material as possible: 4% / 16% / 39% / 42% (n = 396)
l) I tried to figure out which ideas I had not really understood: 10% / 31% / 42% / 18% (n = 396)
m) I tried to memorise what I thought was important: 1% / 7% / 34% / 58% (n = 397)
n) I studied material that went beyond what is expected for the exam: 59% / 23% / 13% / 5% (n = 396)


Learning strategies by gender

Table A14. English: learning strategies by gender (percentages)

Responses: Almost never / Now and then / Often / Always (n)

a) I tried to memorise all the material that I was taught
   Females: 13% / 34% / 33% / 20% (n = 500); Males: 18% / 39% / 34% / 10% (n = 262)
b) I started by figuring out exactly what I needed to learn
   Females: 5% / 14% / 34% / 47% (n = 499); Males: 5% / 19% / 39% / 37% (n = 264)
c) I tried to relate new information to knowledge from other subjects
   Females: 40% / 28% / 16% / 16% (n = 499); Males: 45% / 27% / 15% / 13% (n = 262)
d) I checked if I understood what I had read
   Females: 4% / 15% / 33% / 48% (n = 498); Males: 7% / 17% / 35% / 41% (n = 264)
e) I tried to learn my notes by heart
   Females: 23% / 27% / 26% / 24% (n = 499); Males: 25% / 33% / 24% / 18% (n = 263)
f) I chose not to study some topics as I thought they would not come up
   Females: 32% / 28% / 25% / 15% (n = 500); Males: 20% / 24% / 29% / 27% (n = 264)
g) I figured out how the information might be useful in the real world
   Females: 55% / 25% / 13% / 7% (n = 499); Males: 55% / 23% / 13% / 10% (n = 263)
h) I tried to understand the revision material better by relating it to what I already knew
   Females: 17% / 24% / 36% / 23% (n = 500); Males: 19% / 31% / 32% / 18% (n = 263)
i) I made sure that I remembered the most important points in the revision material
   Females: 1% / 7% / 34% / 57% (n = 500); Males: 3% / 9% / 40% / 49% (n = 263)
j) If I did not understand something, I looked for additional information to clarify it
   Females: 9% / 27% / 31% / 32% (n = 500); Males: 14% / 26% / 30% / 30% (n = 263)
k) I tried to memorise as much of the revision material as possible
   Females: 9% / 24% / 33% / 34% (n = 498); Males: 14% / 30% / 39% / 18% (n = 263)
l) I tried to figure out which ideas I had not really understood
   Females: 15% / 34% / 34% / 17% (n = 499); Males: 14% / 36% / 33% / 16% (n = 263)
m) I tried to memorise what I thought was important
   Females: 2% / 11% / 35% / 52% (n = 499); Males: 5% / 15% / 35% / 45% (n = 262)
n) I studied material that went beyond what is expected for the exam
   Females: 57% / 25% / 9% / 9% (n = 498); Males: 54% / 27% / 11% / 8% (n = 263)


Table A15. Biology: learning strategies by gender (percentages)

Responses: Almost never / Now and then / Often / Always (n)

a) I tried to memorise all the material that I was taught
   Females: 3% / 19% / 34% / 45% (n = 385); Males: 8% / 18% / 36% / 38% (n = 169)
b) I started by figuring out exactly what I needed to learn
   Females: 4% / 13% / 30% / 52% (n = 381); Males: 4% / 23% / 28% / 46% (n = 168)
c) I tried to relate new information to knowledge from other subjects
   Females: 21% / 28% / 28% / 23% (n = 383); Males: 22% / 20% / 30% / 28% (n = 168)
d) I checked if I understood what I had read
   Females: 2% / 9% / 31% / 58% (n = 383); Males: 5% / 14% / 27% / 54% (n = 169)
e) I tried to learn my notes by heart
   Females: 11% / 18% / 28% / 43% (n = 384); Males: 17% / 21% / 33% / 29% (n = 169)
f) I chose not to study some topics as I thought they would not come up
   Females: 42% / 28% / 18% / 11% (n = 384); Males: 38% / 27% / 15% / 20% (n = 169)
g) I figured out how the information might be useful in the real world
   Females: 28% / 21% / 28% / 23% (n = 384); Males: 20% / 20% / 35% / 25% (n = 169)
h) I tried to understand the revision material better by relating it to what I already knew
   Females: 10% / 20% / 38% / 32% (n = 384); Males: 7% / 23% / 40% / 30% (n = 167)
i) I made sure that I remembered the most important points in the revision material
   Females: 1% / 6% / 24% / 69% (n = 385); Males: 2% / 9% / 28% / 62% (n = 169)
j) If I did not understand something, I looked for additional information to clarify it
   Females: 6% / 16% / 30% / 48% (n = 384); Males: 9% / 21% / 24% / 46% (n = 169)
k) I tried to memorise as much of the revision material as possible
   Females: 4% / 11% / 31% / 55% (n = 384); Males: 9% / 14% / 38% / 40% (n = 169)
l) I tried to figure out which ideas I had not really understood
   Females: 6% / 21% / 39% / 34% (n = 383); Males: 13% / 20% / 37% / 31% (n = 168)
m) I tried to memorise what I thought was important
   Females: 1% / 4% / 29% / 66% (n = 384); Males: 2% / 7% / 31% / 60% (n = 169)
n) I studied material that went beyond what is expected for the exam
   Females: 56% / 22% / 13% / 9% (n = 383); Males: 41% / 24% / 17% / 18% (n = 169)


Table A16. Geography: learning strategies by gender (percentages)

Responses: Almost never / Now and then / Often / Always (n)

a) I tried to memorise all the material that I was taught
   Females: 7% / 23% / 32% / 37% (n = 227); Males: 8% / 21% / 35% / 36% (n = 169)
b) I started by figuring out exactly what I needed to learn
   Females: 4% / 12% / 32% / 51% (n = 225); Males: 5% / 13% / 33% / 49% (n = 170)
c) I tried to relate new information to knowledge from other subjects
   Females: 18% / 23% / 29% / 30% (n = 223); Males: 21% / 27% / 27% / 24% (n = 169)
d) I checked if I understood what I had read
   Females: 4% / 11% / 38% / 47% (n = 224); Males: 6% / 12% / 37% / 45% (n = 170)
e) I tried to learn my notes by heart
   Females: 10% / 24% / 30% / 36% (n = 224); Males: 13% / 22% / 30% / 34% (n = 169)
f) I chose not to study some topics as I thought they would not come up
   Females: 24% / 32% / 20% / 23% (n = 226); Males: 21% / 29% / 25% / 24% (n = 169)
g) I figured out how the information might be useful in the real world
   Females: 34% / 25% / 26% / 15% (n = 225); Males: 28% / 30% / 25% / 17% (n = 169)
h) I tried to understand the revision material better by relating it to what I already knew
   Females: 13% / 18% / 39% / 29% (n = 226); Males: 6% / 24% / 39% / 31% (n = 170)
i) I made sure that I remembered the most important points in the revision material
   Females: 1% / 6% / 27% / 67% (n = 225); Males: 2% / 11% / 32% / 56% (n = 170)
j) If I did not understand something, I looked for additional information to clarify it
   Females: 8% / 24% / 34% / 34% (n = 226); Males: 11% / 24% / 34% / 31% (n = 170)
k) I tried to memorise as much of the revision material as possible
   Females: 4% / 13% / 38% / 46% (n = 226); Males: 4% / 20% / 40% / 36% (n = 170)
l) I tried to figure out which ideas I had not really understood
   Females: 10% / 32% / 39% / 19% (n = 227); Males: 11% / 28% / 46% / 15% (n = 169)
m) I tried to memorise what I thought was important
   Females: 0% / 6% / 33% / 60% (n = 227); Males: 3% / 8% / 34% / 55% (n = 170)
n) I studied material that went beyond what is expected for the exam
   Females: 65% / 22% / 9% / 4% (n = 226); Males: 52% / 26% / 17% / 5% (n = 170)


Learning strategies and family SES

Table A17. Learning strategies and family SES (correlation coefficients (Rho))

Rho (n) per subject: English / Biology / Geography

a) I tried to memorise all the material that I was taught: -0.02 (709) / 0.05 (518) / 0.06 (372)
b) I started by figuring out exactly what I needed to learn: 0.03 (710) / 0.01 (513) / 0.02 (371)
c) I tried to relate new information to knowledge from other subjects: 0.10 (708) / 0.00 (515) / 0.03 (368)
d) I checked if I understood what I had read: 0.08 (709) / 0.14 (516) / 0.11 (370)
e) I tried to learn my notes by heart: -0.06 (709) / 0.01 (517) / -0.02 (370)
f) I chose not to study some topics as I thought they would not come up: -0.04 (711) / -0.07 (517) / 0.02 (371)
g) I figured out how the information might be useful in the real world: 0.05 (709) / 0.08 (517) / 0.01 (370)
h) I tried to understand the revision material better by relating it to what I already knew: 0.00 (710) / -0.01 (515) / -0.01 (372)
i) I made sure that I remembered the most important points in the revision material: 0.07 (710) / -0.01 (518) / 0.06 (372)
j) If I did not understand something, I looked for additional information to clarify it: 0.11 (710) / 0.13 (517) / 0.05 (372)
k) I tried to memorise as much of the revision material as possible: -0.09 (708) / -0.04 (517) / 0.08 (372)
l) I tried to figure out which ideas I had not really understood: 0.06 (709) / 0.08 (515) / 0.05 (372)
m) I tried to memorise what I thought was important: -0.02 (708) / 0.03 (517) / -0.04 (373)
n) I studied material that went beyond what is expected for the exam: 0.11 (709) / 0.08 (517) / 0.01 (373)


Learning strategies and exam results

Table A18. English: learning strategies and exam results (mean scores (M) and sample size (n))

M (n) by response: Almost never / Now and then / Often / Always

a) I tried to memorise all the material that I was taught: 72.24 (96) / 70.61 (220) / 67.86 (206) / 70.94 (90)
b) I started by figuring out exactly what I needed to learn: 67.12 (33) / 70.10 (104) / 68.77 (212) / 71.27 (264)
c) I tried to relate new information to knowledge from other subjects: 69.54 (260) / 70.83 (168) / 70.37 (95) / 69.33 (89)
d) I checked if I understood what I had read: 69.22 (32) / 68.18 (96) / 69.55 (199) / 70.95 (285)
e) I tried to learn my notes by heart: 72.08 (154) / 69.23 (176) / 69.49 (156) / 69.13 (126)
f) I chose not to study some topics as I thought they would not come up: 73.41 (164) / 70.03 (165) / 67.66 (158) / 68.35 (127)
g) I figured out how the information might be useful in the real world: 70.36 (336) / 70.57 (150) / 68.33 (78) / 68.44 (48)
h) I tried to understand the revision material better by relating it to what I already knew: 73.18 (110) / 68.35 (161) / 69.31 (211) / 70.31 (131)
i) I made sure that I remembered the most important points in the revision material: 70.83 (12) / 68.57 (49) / 69.01 (223) / 71.02 (329)
j) If I did not understand something, I looked for additional information to clarify it: 67.86 (70) / 68.92 (166) / 70.05 (189) / 71.99 (188)
k) I tried to memorise as much of the revision material as possible: 73.62 (69) / 69.94 (168) / 68.41 (207) / 70.86 (169)
l) I tried to figure out which ideas I had not really understood: 70.54 (93) / 70.27 (207) / 68.60 (207) / 72.26 (106)
m) I tried to memorise what I thought was important: 68.42 (19) / 70.51 (79) / 69.51 (216) / 70.47 (298)
n) I studied material that went beyond what is expected for the exam: 69.39 (342) / 70.51 (158) / 72.03 (59) / 71.23 (53)


Table A19. Biology: learning strategies and exam results (mean scores (M) and sample size (n))

M (n) by response: Almost never / Now and then / Often / Always

a) I tried to memorise all the material that I was taught: 64.09 (22) / 61.55 (84) / 69.05 (158) / 76.30 (181)
b) I started by figuring out exactly what I needed to learn: 72.50 (18) / 62.81 (73) / 69.76 (125) / 73.07 (225)
c) I tried to relate new information to knowledge from other subjects: 68.23 (93) / 68.93 (112) / 71.72 (128) / 72.45 (110)
d) I checked if I understood what I had read: 55.91 (11) / 60.00 (48) / 63.50 (133) / 76.52 (253)
e) I tried to learn my notes by heart: 73.23 (62) / 63.95 (86) / 68.11 (132) / 74.36 (165)
f) I chose not to study some topics as I thought they would not come up: 78.13 (176) / 66.65 (121) / 67.13 (87) / 60.75 (60)
g) I figured out how the information might be useful in the real world: 64.95 (111) / 70.64 (94) / 72.61 (138) / 73.56 (101)
h) I tried to understand the revision material better by relating it to what I already knew: 65.13 (39) / 65.26 (95) / 70.94 (171) / 74.93 (137)
i) I made sure that I remembered the most important points in the revision material: 56.67 (6) / 57.58 (33) / 67.37 (112) / 73.18 (294)
j) If I did not understand something, I looked for additional information to clarify it: 50.48 (31) / 62.90 (81) / 71.76 (122) / 75.64 (210)
k) I tried to memorise as much of the revision material as possible: 70.80 (25) / 62.69 (54) / 66.54 (153) / 75.31 (212)
l) I tried to figure out which ideas I had not really understood: 52.00 (35) / 65.90 (89) / 71.02 (171) / 77.20 (148)
m) I tried to memorise what I thought was important: 59.00 (5) / 65.19 (27) / 68.22 (135) / 72.33 (277)
n) I studied material that went beyond what is expected for the exam: 68.50 (226) / 71.58 (95) / 73.62 (65) / 73.16 (57)


Table A20. Geography: learning strategies and exam results (mean scores (M) and sample size (n))

M (n) by response: Almost never / Now and then / Often / Always

a) I tried to memorise all the material that I was taught: 66.90 (21) / 69.03 (72) / 73.66 (101) / 75.66 (113)
b) I started by figuring out exactly what I needed to learn: 71.43 (14) / 70.16 (32) / 71.52 (105) / 74.55 (156)
c) I tried to relate new information to knowledge from other subjects: 70.93 (59) / 72.08 (65) / 73.57 (91) / 74.44 (90)
d) I checked if I understood what I had read: 68.85 (13) / 67.42 (33) / 73.35 (112) / 74.66 (148)
e) I tried to learn my notes by heart: 72.65 (34) / 72.74 (73) / 71.63 (89) / 74.18 (110)
f) I chose not to study some topics as I thought they would not come up: 73.82 (68) / 72.39 (94) / 72.60 (73) / 72.71 (72)
g) I figured out how the information might be useful in the real world: 73.68 (95) / 74.94 (89) / 72.13 (75) / 69.57 (46)
h) I tried to understand the revision material better by relating it to what I already knew: 72.32 (28) / 70.96 (57) / 73.33 (123) / 74.19 (99)
i) I made sure that I remembered the most important points in the revision material: 68.33 (3) / 67.04 (27) / 70.51 (88) / 74.84 (189)
j) If I did not understand something, I looked for additional information to clarify it: 71.55 (29) / 70.64 (70) / 73.33 (111) / 75.00 (97)
k) I tried to memorise as much of the revision material as possible: 70.42 (12) / 68.73 (51) / 71.71 (120) / 75.84 (125)
l) I tried to figure out which ideas I had not really understood: 68.57 (28) / 71.98 (91) / 73.45 (132) / 75.27 (56)
m) I tried to memorise what I thought was important: 66.67 (3) / 71.00 (20) / 71.50 (103) / 73.90 (182)
n) I studied material that went beyond what is expected for the exam: 73.57 (178) / 70.89 (73) / 75.25 (40) / 67.19 (16)


Learning support

Overall results

Table A21. What kinds of support for learning did students have? (percentages)

No% / Yes% (n) per subject: English; Biology; Geography

a) Which topics were likely to come up was explained to me: English 25%/75% (n = 748); Biology 33%/67% (n = 544); Geography 25%/75% (n = 384)
b) Marking criteria were explained to me: English 19%/81% (n = 747); Biology 23%/77% (n = 542); Geography 13%/87% (n = 386)
c) I used material from grinds websites: English 66%/34% (n = 750); Biology 73%/27% (n = 543); Geography 73%/27% (n = 387)
d) Model answers were given to me: English 18%/82% (n = 747); Biology 69%/31% (n = 543); Geography 13%/87% (n = 386)
e) I was given past papers: English 6%/94% (n = 748); Biology 5%/95% (n = 543); Geography 6%/94% (n = 385)
f) I have textbooks to help with my study: English 18%/82% (n = 748); Biology 7%/93% (n = 542); Geography 8%/92% (n = 385)
g) The exam format was explained to me: English 8%/92% (n = 747); Biology 14%/86% (n = 542); Geography 9%/91% (n = 385)
h) I used revision guides: English 46%/54% (n = 747); Biology 36%/64% (n = 541); Geography 38%/62% (n = 386)
i) I looked at past papers on the internet: English 34%/66% (n = 748); Biology 26%/74% (n = 542); Geography 35%/65% (n = 385)
j) My parents helped me with my studies: English 77%/23% (n = 751); Biology 84%/16% (n = 545); Geography 83%/17% (n = 386)
k) Friends helped me to prepare for the exams: English 61%/39% (n = 749); Biology 50%/50% (n = 543); Geography 56%/44% (n = 387)
l) I used revision apps: English 87%/13% (n = 752); Biology 85%/15% (n = 544); Geography 88%/12% (n = 387)
m) I took one-to-one or small-group grinds: English 86%/14% (n = 752); Biology 87%/13% (n = 545); Geography 89%/11% (n = 389)
n) I attended a grinds school: English 92%/8% (n = 752); Biology 92%/8% (n = 545); Geography 95%/5% (n = 388)

Learning support by gender

Table A22. English: learning support by gender (percentages)

Females Males

No Yes n No Yes n

a) Which topics were likely to come up was explained to me 28% 72% 487 20% 80% 260

b) Marking criteria were explained to me 19% 81% 487 18% 82% 259

c) I used material from grinds websites 65% 35% 488 68% 32% 261

d) Model answers were given to me 19% 81% 486 17% 83% 260

e) I was given past papers 6% 94% 487 8% 92% 260

f) I have textbooks to help with my study 18% 82% 488 18% 82% 259

g) The exam format was explained to me 8% 92% 486 8% 92% 260

h) I used revision guides 47% 53% 486 46% 54% 260

i) I looked at past papers on the internet 37% 63% 488 30% 70% 259

j) My parents helped me with my studies 76% 24% 490 80% 20% 260

k) Friends helped me to prepare for the exams 62% 38% 488 60% 40% 260

l) I used revision apps 87% 13% 489 88% 12% 262

m) I took one-to-one or small-group grinds 85% 15% 490 89% 11% 261

n) I attended a grinds school 92% 8% 490 92% 8% 261


Table A23. Biology: learning support by gender (percentages)

Females Males

No Yes n No Yes n

a) Which topics were likely to come up was explained to me 36% 64% 377 26% 74% 167

b) Marking criteria were explained to me 24% 76% 375 22% 78% 167

c) I used material from grinds websites 72% 28% 376 76% 24% 167

d) Model answers were given to me 72% 28% 376 63% 37% 167

e) I was given past papers 5% 95% 376 5% 95% 167

f) I have textbooks to help with my study 8% 92% 375 4% 96% 167

g) The exam format was explained to me 14% 86% 375 14% 86% 167

h) I used revision guides 38% 62% 374 32% 68% 167

i) I looked at past papers on the internet 27% 73% 375 24% 76% 167

j) My parents helped me with my studies 85% 15% 378 82% 18% 167

k) Friends helped me to prepare for the exams 49% 51% 376 52% 48% 167

l) I used revision apps 85% 15% 377 85% 15% 167

m) I took one-to-one or small-group grinds 87% 13% 378 87% 13% 167

n) I attended a grinds school 91% 9% 378 94% 6% 167

Table A24. Geography: learning support by gender (percentages)

Females Males

No Yes n No Yes n

a) Which topics were likely to come up was explained to me 30% 70% 217 18% 82% 167

b) Marking criteria were explained to me 11% 89% 218 15% 85% 168

c) I used material from grinds websites 75% 25% 219 70% 30% 168

d) Model answers were given to me 9% 91% 218 18% 82% 168

e) I was given past papers 4% 96% 218 10% 90% 167

f) I have textbooks to help with my study 7% 93% 218 8% 92% 167

g) The exam format was explained to me 8% 92% 218 10% 90% 167

h) I used revision guides 40% 60% 218 36% 64% 168

i) I looked at past papers on the internet 39% 61% 218 31% 69% 167

j) My parents helped me with my studies 81% 19% 219 86% 14% 167

k) Friends helped me to prepare for the exams 56% 44% 218 57% 43% 169

l) I used revision apps 89% 11% 218 86% 14% 169

m) I took one-to-one or small-group grinds 90% 10% 220 88% 12% 169

n) I attended a grinds school 96% 4% 219 93% 7% 169


Learning support and exam results

Table A25. Learning support and exam results (mean scores (M) and sample sizes (n))

English Biology Geography

No Yes No Yes No Yes

a) Which topics were likely to come up was explained to me M 72.04 69.51 67.20 71.89 73.15 72.74

n 147 451 134 302 65 232

b) Marking criteria were explained to me M 66.93 70.87 60.51 73.34 65.31 73.76

n 109 488 99 335 32 267

c) I used material from grinds websites M 70.65 69.15 72.04 66.45 74.14 69.35

n 394 205 314 121 215 84

d) Model answers were given to me M 72.69 69.54 72.27 66.27 70.98 73.14

n 108 489 297 138 41 258

e) I was given past papers M 71.00 70.10 67.78 70.54 75.45 72.68

n 35 563 18 417 22 276

f) I have textbooks to help with my study M 69.33 70.34 70.71 70.39 76.43 72.62

n 105 493 28 406 21 277

g) The exam format was explained to me M 65.11 70.52 56.23 72.56 71.15 73.05

n 46 551 57 377 26 272

h) I used revision guides M 71.42 69.01 71.05 70.11 75.79 70.95

n 278 319 157 276 114 185

i) I looked at past papers on the internet M 70.00 70.28 67.04 71.60 71.86 73.45

n 201 397 113 321 105 193

j) My parents helped me with my studies M 70.07 70.42 70.48 70.28 73.53 69.47

n 470 130 366 71 251 47

k) Friends helped me to prepare for the exams M 71.01 68.78 73.89 67.24 73.67 71.65

n 365 233 212 223 166 133

l) I used revision apps M 70.62 66.84 70.92 67.76 73.53 67.80

n 523 79 369 67 259 41

m) I took one-to-one or small-group grinds M 70.68 67.08 71.37 64.84 73.74 64.38

n 517 84 375 62 269 32

n) I attended a grinds school M 70.45 66.63 70.43 70.66 72.97 68.21

n 555 46 399 38 286 14


© University of Oxford and Queen’s University Belfast

Oxford University Centre for Educational Assessment

http://oucea.education.ox.ac.uk/