
DETERMINING AND MONITORING THE LANGUAGE PROFICIENCY STANDARDS OF APPLICANTS SEEKING NAATI ACCREDITATION

Report for the National Accreditation Authority for Translators and Interpreters (NAATI) on a workshop convened at the University of Melbourne on October 29 & 30, 2016

Catherine Elder & Ute Knoch
Language Testing Research Centre
The University of Melbourne
November 2016


Contents

0. Executive summary
1. Background
2. Workshop participants
3. Workshop procedures
4. Findings
5. Conclusions and Recommendations
6. References
7. Appendices
   Appendix A. List of participants from the interpreter and translator professions
   Appendix B. Pre-workshop task
   Appendix C. CEFR overall descriptors for Listening, Speaking, Reading and Writing
   Appendix D. Versant evaluation form
   Appendix E. Standard-setting results
   Appendix F. Standard-setting evaluation form
   Appendix G. Workshop evaluation form


0. Executive summary

We report here on the proceedings and outcomes of a two-day workshop commissioned by the National Accreditation Authority for Translators and Interpreters (NAATI) and held at the University of Melbourne in October 2016. The workshop convenors, Assoc. Prof. Cathie Elder and Dr Ute Knoch from the University's Language Testing Research Centre, met and consulted with 17 representatives of the translator and interpreter professions recruited by NAATI. The workshop had the dual aim of a) setting minimum language proficiency standards in English and languages other than English (LOTEs) for interpreting and translating purposes, and b) reviewing the means by which attainment of these minimum standards could be determined for those seeking NAATI accreditation.

The workshop was one of a series of initiatives commissioned by NAATI in relation to a proposed revision of the procedures for NAATI accreditation. A study by Hale et al. (2012) had signalled the need for preliminary language proficiency screening as the first stage in the accreditation process, to guide applicants in determining their readiness to sit the relevant interpreter or translator test. Two further reports by Elder, Knoch & Kim (2016a and 2016b) outlined some fundamental principles which should underpin the choice of appropriate tests for the intended purpose of voluntary self-access language proficiency screening and identified a number of tests which might be used.¹ The reports also highlighted the need for an overarching language proficiency framework to set minimum language proficiency standards for each profession and to serve as a means of interpreting results on the language tests chosen for screening purposes.

The Common European Framework of Reference (CEFR), widely used by educational systems in Europe and beyond, was chosen as a point of reference for the standard-setting exercise. The CEFR divides language proficiency into three broad bands: Band A (Basic User), Band B (Independent User) and Band C (Proficient User). Each band is further subdivided into two finer levels (A1 and A2, B1 and B2, C1 and C2), each of which is accompanied by a description of the typical learner's ability in the form of a series of 'Can do' statements. Overall descriptors are provided for each skill (see Appendix C), along with a more detailed breakdown of sub-skill components to aid users in reaching a common understanding of the meaning of each level.

The first day of the workshop began by familiarizing participants with the CEFR level descriptors for the skills of Listening, Speaking, Reading and Writing respectively. Participants' views were then canvassed on which descriptor best characterized what they would consider the minimally acceptable level required for performance of the interpreter and translator role, both at the provisionally certified (or paraprofessional) level and the certified (or professional) level. Participants were also asked to rate samples of work at different CEFR levels as a means of firming up their judgements of what constituted minimally acceptable performance. Judgements were then pooled to arrive at a consensus view of the minimally acceptable standards required.

During the second day of the workshop participants were introduced to a number of existing language tests in English and LOTE identified in our earlier reports as potentially suitable for language screening purposes. All of these tests were designed for online delivery and automatic scoring. Participants were briefed on the qualities of these tests and, where feasible, given a chance to try out the tests for themselves. Their views on the appropriateness of each test for its intended purpose were discussed, and test evaluation forms were completed. Draft specifications and tasks for a custom-built language screening test, to be owned and administered by NAATI, were also reviewed.

¹ Full descriptions of the English and LOTE testing options are provided in Elder et al. 2016a and 2016b respectively.


Analysis of feedback from the workshop participants, presented in detail in the report that follows, yielded the following recommendations for consideration by NAATI. It should be noted that some of these constitute a refinement of what was proposed in our previous reports.

It is recommended that:

1. the following CEFR levels proposed by the workshop participants be tentatively adopted by NAATI as indicating the minimally acceptable standards for each profession in both English and LOTE.

            Provisionally certified interpreter   Certified interpreter
Listening   B1+                                   C1
Speaking    B2                                    C1
Reading     B2                                    C1
Writing     B1                                    B2

            Recognised translator   Certified translator
Listening   -                       -
Speaking    -                       -
Reading     B2                      C1
Writing     B2                      C1

2. the above minimum standards be used as a basis for determining cut-scores or thresholds for satisfactory performance on the various test options (see below) that are proposed for screening.

3. the above levels/minimum standards be reviewed over time and if necessary adjusted with reference to a) exit proficiency levels achieved by students completing NAATI-approved courses and b) evidence comparing CEFR levels of candidates (as reflected in results on the language proficiency tests recommended below) with subsequent performance on the relevant NAATI tests.²

4. the following tests be adopted for screening of candidates' English proficiency, giving due consideration to the status of testing, that is, whether it is undertaken on a voluntary basis for the purpose of guiding NAATI applicants as to their readiness to take the relevant NAATI test, or as a compulsory requirement (for Recognition purposes).

For interpreters:

² Particular attention should be paid to monitoring the appropriate level of writing for translators, as the levels set during the review of the CEFR descriptors, as noted under Findings below, did not corroborate the findings of the writing samples review for these professions.


Voluntary screening:
- Dialang English (all tasks), AND (for those performing close to or above the minimum CEFR standard on Dialang) Versant English (Speaking) (self-access)
- OR (on request) Custom test (NAATI administered and rated)

Compulsory screening:
- Versant English (proctored by NAATI)
- OR (on request) Custom test (NAATI administered and rated)

For translators:

Voluntary screening:
- Dialang English (all tasks), AND (for those performing close to or above the minimum CEFR standard on Dialang) Versant Writing (self-access)
- OR (on request) Custom English test (administered & rated by NAATI panel)

Compulsory screening:
- Versant Writing (proctored by NAATI)
- OR (on request) Custom English test (administered & rated by NAATI panel)

5. the following tests be adopted for screening of candidates' proficiency in LOTE, giving due consideration to the status of testing, that is, whether it is undertaken on a voluntary basis for the purpose of guiding NAATI applicants as to their readiness to take the relevant NAATI test, or as a compulsory requirement (for Recognition purposes).


For interpreters:

Voluntary screening:
- Dialang (all tasks) (for Danish, Dutch, Finnish, French, German, Greek, Icelandic, Irish Gaelic, Italian, Norwegian, Portuguese, Spanish and Swedish)
- AND/OR Versant (Speaking) (for Arabic, Chinese, Dutch, French and Spanish)
- OR ACTFL OPIc (for Arabic, Bengali, English, French, German, Italian, Indonesian, Japanese, Korean, Pashto, Persian Farsi, Portuguese, Russian, Spanish and Tagalog)
- OR Custom test (NAATI administered and rated) (for all other languages)

Compulsory screening:
- Versant Speaking (proctored) (for Arabic, Chinese, Dutch, French and Spanish)
- OR ACTFL OPIc (certification version) (for Arabic, Bengali, English, French, German, Italian, Indonesian, Japanese, Korean, Pashto, Persian Farsi, Portuguese, Russian, Spanish and Tagalog)
- OR Custom test (NAATI administered and rated) (for all other languages)

For translators:

Voluntary screening:
- Dialang (all tasks) (for Danish, Dutch, Finnish, French, German, Greek, Icelandic, Irish Gaelic, Italian, Norwegian, Portuguese, Spanish and Swedish), if the language is available
- AND/OR ACTFL WPT (for Arabic, Chinese-Cantonese, Chinese-Mandarin, Danish, French, German, Greek, Hindi, Italian, Japanese, Korean, Norwegian, Polish, Portuguese, Russian, Spanish, Turkish and Vietnamese)
- OR Custom test (NAATI administered and rated) (for all other languages)

Compulsory screening:
- ACTFL WPT (certification version) (for Arabic, Chinese-Cantonese, Chinese-Mandarin, Danish, French, German, Greek, Hindi, Italian, Japanese, Korean, Norwegian, Polish, Portuguese, Russian, Spanish, Turkish and Vietnamese)
- OR Custom test (NAATI administered and rated) (for all other languages)


6. where English or LOTE screening is implemented as a requirement, the recommended tests be administered under proctored conditions so that NAATI can ascertain the identity of candidates, directly access their test results and have confidence in the integrity of those results.

7. to develop the custom tests mentioned above, NAATI commission a project to refine the draft specifications and sample tasks used for the current workshop so that a) appropriate adjustments can be made to reflect the different needs of the interpreter and translator professions, b) trialling can be undertaken and c) suitable training materials can be developed for test implementation.

8. a series of workshops for NAATI examining panels be convened for the purpose of a) familiarizing participants with the test specifications and test tasks, b) developing alternative versions of these tasks as required, c) briefing participants in procedures for administering the custom tests to the relevant language groups and d) training them in the use of a common CEFR-linked language proficiency rating scale to score performance on these tasks. This process should begin with English and then proceed to LOTEs in high demand where no other screening test option is available.

9. NAATI clearly articulate a policy in relation to both the minimally acceptable proficiency standards in English and LOTE for the interpreter and translator professions and the language proficiency screening tests which it proposes as a means of gauging applicants' status in relation to these standards.

10. this policy be communicated to stakeholders through the creation of a website with language-specific links giving information about the recommended language screening options for each language, the manner in which the relevant tests can be accessed by candidates, how results can be interpreted and what actions are advised in relation to these results.

1. Background

This report describes the proceedings and outcomes of a two-day workshop for a panel of NAATI-accredited interpreters and translators held at the University of Melbourne in October 2016. The workshop arose from the recommendations of two previous reports (Elder, Knoch and Kim 2016a and 2016b), which reviewed options for language proficiency screening (in English and the relevant language other than English (LOTE)) of applicants seeking NAATI recognition or accreditation as interpreters and translators.

The need for preliminary language proficiency screening had been identified by Hale et al. (2012) as the first stage in a revised model for NAATI accreditation. The screening tests were to be delivered online and automatically scored, for the purpose of alerting potential applicants to the language requirements of the professions. The perceived benefit of preliminary screening would be to reduce the unduly high failure rate on NAATI interpreter and translator tests by discouraging those who were not ready to sit these exams from investing further in the accreditation process until they had had a chance to develop their language skills to an appropriate level.

The reports by Elder et al. highlighted the need for agreed minimum standards of language proficiency for interpreters and translators, using a recognized language levels framework as a point of reference. These minimum standards could then be used as a basis for determining the level of performance required on whatever tests were chosen for screening purposes. It was also recommended that input be sought from a representative sample of interpreting and translating professionals for the purpose of setting these standards. The two reports also reviewed a range of available tests in English and LOTE that might meet the preliminary screening requirement and proposed a number of options for consideration by NAATI.

The subsequent workshop held in Melbourne, which will be described in detail below, served the dual purpose of: a) setting minimum language proficiency standards for interpreters and translators and b) reviewing the proposed options for language proficiency screening.

2. Workshop participants

The convenors of the workshop were Assoc. Prof. Cathie Elder and Dr Ute Knoch, both specialists in language testing and authors of the earlier reports. Cathie Elder is Principal Fellow and Ute Knoch is Director of the LTRC in the School of Languages and Linguistics at the University of Melbourne. Dr Kim, co-author of the abovementioned language reports, was also in attendance. A NAATI representative, Lara Wiseman, also attended and assisted with organizational matters before, during and following the workshop.

Representatives from the interpreter and translator professions were recruited via an email from NAATI calling for expressions of interest from accredited translators and interpreters in Victoria. A subset of volunteers was then chosen with an eye to achieving a good balance of:

- interpreting and translating experience;
- paraprofessional and professional level accreditation;
- native speakers and speakers of English as an additional language;
- representation of a broad range of LOTEs, including high-demand and new/emerging languages; and
- experience of examining and educating interpreters.

A list of the 17 participants and the languages they represented is provided in Appendix A. Four reported English to be their native language. The other languages covered were Amharic, Arabic, AUSLAN, Dari-Hazaragi, Chinese, Croatian, Greek, Finnish, German, Hindi, Indonesian, Macedonian, Marathi, Pashto, Persian, Russian, Serbian, Singhalese, Somali, Tigrinya and Vietnamese. Most participants had both translator and interpreter qualifications at Level 3, but there were also a number of Level 2 professionals present, particularly in the new and emerging languages. One participant was a qualified conference interpreter.

The majority had substantial experience in their profession. The mean years of experience for interpreting was 19.7 (ranging from a minimum of 6 to a maximum of 29). For translating the mean was 18.6 (ranging from a minimum of 3 years to a maximum of 29). All but one participant had worked as a NAATI examiner, and four had served or were currently serving as panel chairs. Fourteen had also worked or were currently working as educators in the profession, either on a freelance basis or at one of the following institutions:

- Defence Force School of Languages, Point Cook
- Western Sydney University
- TAFE NSW
- Monash University
- Deakin University
- RMIT

In sum, the group represented a range of languages and different dimensions of the interpreting and translating role, and its members were therefore well qualified to speak on behalf of these professions.

3. Workshop procedures

Pre-workshop


Prior to the workshop, participants were asked to reflect on the kinds of knowledge and skills typically involved in performing tasks associated with the interpreter and translator role (see Appendix B for the task prescribed). This was a means of preparing participants for the standard-setting activities which took place during Day One of the workshop.

Day One – Standard setting

After briefing participants about the purpose of the workshop, and discussing the limitations of publicly available definitions of language proficiency for the interpreting and translating professions,³ the convenors introduced participants to the notion of a language levels framework or scale of language ability and pointed to the benefits of such a framework as a means of indicating standards of ability for professional or academic purposes. It was explained that a proficiency framework is not a test but can serve as a basis for setting standards and for linking and interpreting results from different tests. Various frameworks were reviewed (see Elder et al. 2016a and 2016b for further detail), including the Common European Framework of Reference (CEFR), which, given its international currency, was chosen as the most appropriate one for the purpose of setting standards for the interpreter and translator professions.

The CEFR divides proficiency into three broad bands: Band A (Basic User), Band B (Independent User) and Band C (Proficient User). Each band is subdivided into two finer levels: A1 and A2, B1 and B2, and C1 and C2. There are descriptors for each level overall and also on a skill-by-skill basis, and these were distributed to participants. (The overall descriptors for each skill are reproduced in Appendix C and a complete set of descriptors can be found at http://ebcl.eu.com/wp-content/uploads/2011/11/CEFR-all-scales-and-all-skills.pdf.)

It was the task of those present to determine which levels would qualify as minimally acceptable standards for each profession. It was explained that four different standards would be set: two for interpreters (at the provisionally certified and certified levels) and two for translators (at the recognised and certified levels). Participants were also given the opportunity to set different standards for English and LOTE, should they believe this was necessary or appropriate. The steps in the process of standard-setting were then outlined for participants as follows:

- Consider the typical tasks involved in interpreting and translating (pre-workshop preparation)
- Familiarize yourself with the CEFR descriptors (skill by skill)
- Write down on the form provided the CEFR level that best matches your notion of the minimum language competence needed to perform these typical tasks adequately
- Discuss with your colleagues the reasons for your choice
- Record your final judgement regarding the minimum level of competence required (you may alter what you first put if the discussion has led you to change your mind)
- Hand the form with your completed judgement to the workshop convenor.

These steps were implemented in turn for Listening, Speaking, Reading and Writing. For Speaking and Writing, however, there was an additional step in the standard-setting process. For each of these two skills, once participants had recorded their final judgement about the minimum CEFR levels required for each profession, they were asked to review a number of randomly ordered samples of speaking and writing performance representing different levels of ability and to decide in each case whether the speaker or writer was competent (i.e., had reached or exceeded their minimum standard) or not. The samples had previously been benchmarked to the CEFR scale, but this information was not shared with participants. The convenors were able to use the resultant data to check whether each participant's judgements of these samples were aligned with the CEFR level previously chosen as representing a minimally competent standard. Outcomes of the standard-setting process are reported under Findings below.

³ See https://www.naati.com.au/media/1370/draft-descriptors-for-t-and-i_july-2016.pdf

Day Two – Review of screening test options

The second day of the workshop was dedicated to reviewing a number of tests in English and LOTE that had been shortlisted in our previous reports for NAATI as potential candidates for preliminary language proficiency screening. It should be noted that all the shortlisted tests had either been formally linked to the CEFR scale or were in the process of being linked to it.

For English the options were: Versant English (Speaking), developed by Pearson; Versant Writing, also developed by Pearson; the Academic English Screening Test (AEST), developed at the University of Melbourne; and the Dialang test, developed at the University of Lancaster. For LOTE we had shortlisted the following options:

- the Melbourne University Placement tests in French, German, Arabic, Italian, Chinese, Indonesian, Japanese, Spanish and Russian;
- the Versant speaking tests in Arabic, Chinese, Dutch, French and Spanish;
- the Dialang tests in Danish, Dutch, English, Finnish, French, German, Greek, Icelandic, Irish Gaelic, Italian, Norwegian, Portuguese, Spanish and Swedish;
- the American Council on the Teaching of Foreign Languages (ACTFL) computerized Oral Proficiency Interview (OPIc) in Arabic, Bengali, Chinese, Dutch, French, German, Italian, Indonesian, Japanese, Korean, Pashto, Persian Farsi, Portuguese, Russian and Tagalog; and
- the ACTFL Writing Proficiency Test (WPT) in Arabic, Chinese-Cantonese, Chinese-Mandarin, Danish, French, German, Greek, Hindi, Italian, Japanese, Korean, Norwegian, Polish, Portuguese, Russian, Spanish, Turkish and Vietnamese.

In addition to the LOTE options listed above, we had proposed a custom-built test for those languages not represented in the list. (For further information about all the above options see Elder et al. 2016a and 2016b.)

To prepare participants to review these options, we had briefed them (at the end of the first day) on the principles governing the choice of tests for language proficiency screening purposes (see Elder et al. 2016a and 2016b for further details). For each of the English test options listed above, participants were given a background briefing on key features of the test, including information about its reliability and its cost to candidates. Participants were given the chance to try out all or part of each test, either on the phone (in the case of Versant English), online (in the case of Versant Writing and Dialang) or in pen-and-paper form (in the case of the AEST). After attempting each test, participants discussed their views and then completed an evaluation sheet (see Appendix D for an example). We concluded the review with a discussion of the relative merits of each option for the intended purpose. Results of these evaluations and the final discussion are summarized under Findings below.

For the LOTE test options it was not feasible to have participants try out the tests themselves, both because of time limitations and because no tests were available in some of the languages known by the participants (i.e., Amharic, AUSLAN, Dari-Hazaragi, Macedonian, Marathi, Pashto, Persian, Serbian, Singhalese, Somali and Tigrinya are not represented in the above list). We therefore limited ourselves to briefing participants about the qualities of each test. (We did not need to do this for Dialang or Versant, as participants had already tried out the English versions of these tests.) To familiarize participants with what we had in mind for the NAATI-administered custom-built test, we issued participants with draft test specifications and accompanying tasks and had them try out these tasks in English in groups. Participants then discussed this option before completing an evaluation form. The session concluded with a brief discussion about the relative merits of the various LOTE testing options for the intended screening purpose. Participants then filled out a form seeking their evaluative ratings and comments on the workshop as a whole. Findings are reported below.

4. Findings

4.1. Standard setting

Participants were asked to select the minimum necessary level on the CEFR in each of the four language skills for each of four categories: provisionally certified interpreter (replacing the current paraprofessional level accreditation), certified interpreter (replacing the current professional level accreditation), recognised translator, and certified translator (replacing the current professional level accreditation). If a certain skill was deemed not applicable to a particular professional role, participants were asked not to provide a minimum standard for that skill. While there was some variation in the participants' judgements, it was possible to identify which level on the CEFR the majority selected as the minimum level required. The judgements of all participants can be found in Appendix E. Tables 1 and 2 below present the agreed standards for interpreters and translators for each of the language skills deemed relevant.

In the case of writing and speaking, as noted under Procedures above, participants were also asked to review actual speech and writing samples from existing English language tests. These had all been previously benchmarked against the CEFR and were provided to participants in no particular order and without the agreed level. This was done to corroborate the levels provided by the review of the CEFR descriptors. In the case of interpreters, the review of the samples resulted in the same agreed standards for writing and speaking as the review of the CEFR descriptors. This was, however, not the case for the skill of writing for translators, where for recognised translators the participants chose B1 when reviewing the CEFR descriptors and B2 when looking at writing samples. Similarly, for certified or professional translators, participants chose C1 when reviewing the CEFR descriptors and B2 when reviewing the writing samples. Intuitively, the higher standards make sense in this case (i.e. B2 for recognised translators and C1 for certified translators) and are therefore the levels marked with an asterisk in Table 2, but this may have to be examined further in the future.

Table 1: Agreed standards for interpreters

            Provisionally certified interpreter   Certified interpreter
            English   LOTE                        English   LOTE
Listening   B1+       B1+                         C1        C1
Speaking    B2        B2                          C1        C1
Reading     B2        B2                          C1        C1
Writing     B1        B1                          B2        B2

Table 2: Agreed standards for translators

            Recognised translator   Certified translator
            English   LOTE          English   LOTE
Listening   -         -             -         -
Speaking    -         -             -         -
Reading     B2        B2            C1        C1
Writing     B1/B2*    B1            B2/C1*    C1

*Note: CEFR descriptor review and writing sample review resulted in different standards

Evaluation of the standard-setting activity


Evaluation forms were completed by 16 participants early on the second day of the workshop, following a brief report on the results obtained from the previous day's activity (see Appendix F for a copy of the form). Thirteen reported understanding the process of setting standards either well or very well (with ratings of 4 or 5 on a five-point scale) and deemed the process an effective means of determining minimally acceptable standards for the translating and interpreting professions. The CEFR descriptors were also seen as helpful for making decisions about appropriate levels by nearly all (14) participants. The same was true for the speaking and writing samples, which were used to corroborate participants' judgements about what constituted a minimally competent level of proficiency for translating or interpreting purposes. Only two respondents found these samples unhelpful, with one reporting that judging writing samples in English was very difficult for him. This may well have been a reflection of the respondent's English proficiency, rather than a problem with the process itself.

One participant commented that language skills are not the only competencies required for interpreting and translating, and this is most certainly the case. Another commented on the desirability of considering cultural understanding in setting standards. While we acknowledge the multiple areas of knowledge and skill required for the interpreting and translating professions, our brief was to set minimum standards of language proficiency independently of other competencies.

Finally, and most importantly, participants reported a high level of confidence in the standards that emerged from the workshop. Mean scores for this question were in the vicinity of 4 (on a five-point scale) for the Listening, Reading, Speaking and Writing skills respectively. One participant, however, pointed out that, despite having confidence in the standards set, the participants' judgements were inevitably subjective and biased. This is indeed true, since standard setting is no more than a process of eliciting value judgements from those who are qualified in a particular area. The defensibility of the resultant standards rests on the representativeness and professionalism of the expert group, and also on the systematic nature of the process, which allows subjective judgements to be pooled in such a way as to determine the prevailing view of what the minimum standard should be (Cizek & Bunch, 2007).

Another participant was concerned about our equating 'recognised' interpreter and translator status with the provisionally certified or paraprofessional level, stating that the notion of 'recognition' needs to be more clearly defined. This is clearly a policy matter for NAATI and not within the purview of the authors of this report.

4.2. English testing options

Versant English (Speaking)

Twelve workshop participants took the Versant English test on the phone at home after the first day of the workshop. One who failed to do so had some problems with the technology. Another had left the session before the instructions and ID numbers were distributed. Others did not attempt the test due to other commitments. Of those who completed the test, the majority (9) were comfortable or very comfortable with the test-taking process. Those who responded less positively reported that the sound quality on the phone was problematic in that the voices were too soft. One commented that the quality of the recording was particularly poor towards the end of the test, and another reported hearing an echo on the line. American accents posed a problem for two participants.


Most (9 out of 12) reported being familiar with the item types on the test, although one commented on the need for test takers to familiarize themselves with the test format before embarking on the test in order to be prepared for some of the more unusual item types, some involving numerical operations. (A practice test on the website makes this possible.)

All but one of the participants felt satisfied or very satisfied with the test's capacity to measure speaking ability, and 9 of the 12 also felt it measured listening ability adequately or very adequately. This favourable response was characterized by one participant as follows:

"The linguistic and cognitive processing skills tested in the instrument parallel closely the skills at paraprofessional level and reflect challenges in common dialogue interpreting practice."

Another commented positively on the range of speaking and listening skills covered, which "perhaps unwittingly replicates a large number of things that interpreters need to be able to perform; shadowing; automatic thinking on the spot based on available speech; retelling a narrative".

The very detailed score reports were rated highly by 11 of the 12 participants. One person commented that he would like to have been given a score for each different test segment. (This would not, however, be feasible, given that ratings for different aspects of speaking and listening ability are calculated across different tasks, rather than on a task-by-task basis.)

Taking everything into consideration, 7 of the 12 participants felt that the test would be suitable or highly suitable for screening purposes for interpreters. The remaining five, while less enthusiastic (giving a rating of only 3 out of 5), were nevertheless not opposed to this option.

Versant Writing

Participants who attended the second day of the workshop attempted this test online. Unfortunately, we faced some unanticipated problems with internet connections. One person tried out the test on his tablet, which was contrary to advice from the testing agency and created problems for the test-taker due to the small size of the screen.

Nine participants ended up completing the test. Only one of these reported being not very comfortable with the test-taking process. Six of the participants were familiar or very familiar with the item types on the test and 3 gave a rating of 3 out of 5, suggesting that some questions may have been new to them. The majority (6 out of 9) believed that the test was a good or very good measure of their reading ability, and similar responses were given regarding the test's ability to measure writing skills.

Views of the test (whether positive or negative) were reflected in responses regarding the accuracy of the score received, i.e., the two respondents who had reservations about the test construct also had little faith in the accuracy of the final score, whereas the reverse was true for those who judged the test positively.

One participant expressed concern about the typing test at the start of the test, wondering if typing speed might interfere with the assessment of writing skill. (In fact, it is made clear on the test website that while typing speed is not factored into the final score, those with inadequate typing skills are unlikely to perform well on the test since they may not be able to complete the questions in the time provided.)

One or two highly proficient test takers received lower scores than might be expected, suggesting perhaps that the test does not measure proficiency accurately at the upper end of the scale. All respondents were impressed with the score report, finding the level of detail provided very useful.

Taking all these factors into consideration, the majority of participants indicated support for the test's suitability for screening purposes, with 5 of the 9 respondents rating it highly or very highly (i.e. 4 or 5 on the five-point scale). The remaining 4 gave the test a rating of 3 out of 5, indicating that they found it neither suitable nor unsuitable for screening purposes. A comment from one of the participants shows awareness of both the limitations and the strengths of the test:

"It's not bad for a 30-minute test. The first section is a warm up but tests candidates' eye for detail. Section B is good for collocations and vocab repertoire. Dictation is good as a reflection activity and attending to the form or the source of text. The passage reconstruction is good, excellent in fact - as it requires narrative/passage recollection. The last exercise is a free composition exercise."

Academic English Screening Test (AEST)

Participants had the opportunity to try out the two task types on this test using a pencil-and-paper (rather than online) version.

The written feedback offered by 14 of the workshop participants was rather variable. While most (11 out of 14) were comfortable or reasonably comfortable (i.e. chose a four or five on the five-point scale) with the test-taking process, five respondents were either unfamiliar or not very familiar with the item types on this test. Only half of the respondents considered the test adequate as a measure of the underlying skills and knowledge required for interpreting and translating, and the majority had some doubt about its suitability for the intended screening purpose.

A number of participants, however, felt that the test could be useful in assessing linguistic readiness for the translating role (rather than for interpreting). These respondents felt the tasks were a good measure of comprehension and, in particular, that the skills involved in parsing the text (for the cloze elide task) were relevant to a translator's editing role in revising or checking written translations. The general view expressed during the workshop was, however, that this test had notable limitations and, if considered, should only be used in conjunction with other forms of screening.

Dialang

The Dialang test is made up of a number of different skill components, all delivered online. All participants took the preliminary self-assessment and placement tasks, designed to determine which test level (beginner, intermediate or advanced) was most appropriate to their ability. Due to time limitations (all components of the test together can take up to 2 hours to complete), participants were assigned to one of five groups, each of which was instructed to take a different test component, namely: 1) Structure; 2) Vocabulary; 3) Listening; 4) Reading; and 5) Writing.


Some candidates required a little help in getting started, but most adapted quickly to the test requirements. Regardless of which component they took, all participants felt that the self-assessment task administered at the beginning of the test was a valuable exercise. (Candidates receive immediate feedback on their self-assessed level using the CEFR scale as a benchmark and are then able to compare their actual level with the self-assessment.)

All four participants assigned to the Structure group found the test level appropriate for their ability and only one felt that the task types were unfamiliar. Three were highly confident that the items measured their ability accurately and one was undecided. The same three participants found the test feedback extremely useful, and all considered this test component appropriate as a screening tool if used in conjunction with other test components.

The three participants in the Vocabulary group were reasonably comfortable with the test-taking process and felt that this test component measured relevant areas of vocabulary knowledge (including collocations) using an appropriate variety of item types. All were confident in the accuracy of the test outcome and believed that the feedback provided was very useful (especially the option of getting immediate feedback on whether an individual item had been answered correctly). This test component was believed to serve its screening purpose adequately alongside the other components in the test battery.

All but one of the six participants in the Listening group were comfortable with the test-taking process, with one commenting specifically on its user-friendliness. These same participants felt that the tasks were pitched appropriately and measured relevant skills. One respondent commented favourably on the fact that the test was international in its focus rather than 'Australia-centric'. The consensus view was that this would be a suitable test for the intended self-access screening purpose (although one person commented that the test items focussed on implications were less useful than the others, because the role of the interpreter is to focus on what needs to be transferred accurately rather than on drawing inferences). Of the five who completed the Listening section, three found the feedback offered useful.

There were two participants in the Reading group. Both rated the test highly in all respects, including the ease of the test-taking process, familiarity with the item types and the suitability of the test for the screening of translators. One of the participants (who took the Finnish rather than the English version of the test) appreciated the feedback provided, including the suggestions offered for improvement.

The two participants in the Writing group also commented very favourably on this component, considering it easy to navigate and pitched appropriately. Both were confident in the accuracy of the test outcome and felt the test would be suitable for its intended screening purpose. Interestingly, neither commented on the fact that this component assesses writing ability indirectly, using a multiple-choice or short-answer format, rather than requiring the production of extended written text. It seems that participants were prepared to accept this limitation, perhaps on the grounds of practicality.


Summary discussion

The summary discussion held at the end of this part of the workshop revealed the following opinions in relation to English proficiency screening. Note that preferences varied according to whether the screening test was simply to be offered as an option for those seeking to gauge their readiness to take the relevant NAATI exam or instead to be used as a compulsory prerequisite for accreditation.⁴

For interpreter screening:

Dialang English (all tasks), because it was free and user-friendly, was considered to be the best option for optional self-access screening for interpreters, followed by Versant English (Speaking), at a cost of around US$40 per candidate, as an additional check for those who reached an appropriate CEFR level (or tested very close to that level) on Dialang.

If the screening test were to be made compulsory, for example for those seeking recognition, Dialang would not be suitable due to lack of security. The computer-delivered version of Versant English, administered under proctored conditions, was favoured for this purpose.

For translator screening:

Dialang English (all tasks) was again selected as the best option for optional self-access screening for translators, followed by Versant Writing as an additional check for those who reached an appropriate level on Dialang.

For compulsory screening, only Versant Writing, again administered under proctored conditions, would be suitable (on security grounds).

The AEST was ranked lower than these other tests for both optional and compulsory screening purposes.

Another option for compulsory screening of both interpreters and translators, which emerged from the later discussion of LOTE testing options (see below), was that a custom test in English (based on the LOTE specifications presented to participants) be used, administered and rated against the CEFR scale by a trained English panel. This option was felt to be suitable as a requirement for the circa 100 candidates applying for recognition. It could also be made available on a voluntary basis to those individuals simply wishing to check their English proficiency and willing to pay for this service.

4.3. LOTE testing options

As noted above, since the varied backgrounds of the participants made it impossible for the group to try out the various test options themselves, evaluations of individual tests were not undertaken. Views of the shortlisted LOTE tests were nevertheless canvassed at the end of the session based on the information provided (see under Summary discussion of LOTE testing options below). Participants were also asked to evaluate a proposal for the development of tailor-made tests to be administered by NAATI panels for a fee in those languages where no other test option was available.

⁴ NAATI's interest in implementing language proficiency testing as a requirement for certain categories of applicant, rather than on a voluntary basis, only emerged after we had completed our previous reports.


Draft proposal for custom-built test

Feedback on the draft specifications and sample tasks for this test was gleaned from the group discussion and from the forms collected from all 17 participants.

All respondents believed that the proposed specifications and sample tasks were either appropriate (13) or partly appropriate (4) for the intended screening purpose. The latter group suggested the following modifications or reservations:

- Section 1 should have more room to include additional issues. Because the LOTE is candidates' native language, a higher level may be required.
- Some of the terminology may not be appropriate for the LOTE. The LOTE screening test should be developed by LOTE teachers.
- The written response task (Task 4) needs to be split into two (one for interpreters, the other for translators). The writing level needs to be more complex, given that screening material for Professional Translator language assessment needs to be at a higher level.

In response to the question "Should all tasks be taken by candidates regardless of profession or should there be different configurations for interpreter and translator applicants?", opinions varied somewhat, but the majority felt that while most tasks would be suitable for both professions, some profession-specific branching would be desirable, at least for translation purposes, where more sophisticated writing skills are needed.

All participants favoured the idea of attending a further workshop in language-specific groups for the purpose of developing specific tasks to accompany the final test specification.

Summary discussion of LOTE testing options

A discussion held at the end of the workshop served to elicit feedback on the idea of making a range of options available for candidates from different language groups. Participants were generally supportive of this idea, emphasizing that even imperfect solutions to the LOTE screening problem were preferable to the status quo, where candidates often risked embarking on the interpreter or translator tests without any sense of whether they had the requisite language skills to succeed.

Dialang was favoured as appropriate for self-access screening in those languages (Danish, Dutch, Finnish, French, German, Greek, Icelandic, Irish Gaelic, Italian, Norwegian, Portuguese, Spanish and Swedish) in which it is offered. As a follow-up for those scoring at an appropriate CEFR level on Dialang, or as an alternative for those working in languages not covered by Dialang, it was accepted that the ACTFL OPIc (for interpreters working in Arabic, Bengali, English, French, German, Italian, Indonesian, Japanese, Korean, Pashto, Persian Farsi, Portuguese, Russian, Spanish or Tagalog) and the ACTFL WPT (for translators working to and from Arabic, Chinese-Cantonese, Chinese-Mandarin, Danish, French, German, Greek, Hindi, Italian, Japanese, Korean, Norwegian, Polish, Portuguese, Russian, Spanish, Turkish or Vietnamese) might be appropriate options. It was noted, however, that only the ACTFL OPIc and ACTFL WPT (and not Dialang) would be suitable should the testing be required rather than made available on a voluntary self-access basis. In these circumstances the 'certified' (or double-rated) version of the relevant test would have to be used and administered under proctored conditions, as indicated in our previous report (Elder et al. 2016b).

The idea of a custom test developed and administered by NAATI panels on a cost-recovery basis in languages for which no suitable alternatives are available was highly favoured, as already noted above. Some issues relating to administration of the custom test were raised during the summary discussion, including the question of whether the examiner would be anonymous (apparently a principle observed for the NAATI T&I accreditation tests). Our response was that when the test is used for low-stakes purposes (i.e., serving merely as guidance for the candidate), preserving the anonymity of the examiner should not be a matter for concern. It was proposed by one participant that the test could be administered by phone with the candidate attending a local NAATI centre, a proposition that may be worth considering for logistical reasons.

4.4 Workshop evaluation

A final questionnaire eliciting feedback on the workshop as a whole revealed a high level of satisfaction among participants. Fifteen of the 17 respondents reported feeling comfortable (10) or very comfortable (7) with the proficiency standards set on Day One of the workshop, and only two were undecided, rating their comfort level at 3 on a scale of 1-5.

The workshop was deemed to be helpful (4) or very helpful (12) in raising awareness of issues associated with language proficiency screening by all but one of the participants. The majority (15) of participants also expressed satisfaction with the opportunities provided for reviewing the different test options. When commenting on the most valuable aspects of the workshop, participants mentioned: a) familiarizing themselves with the test options, especially being able to try them out for themselves; b) learning about the CEFR and about the standard-setting process; c) opportunities for discussion and exchange of information and opinions with colleagues and language testing experts; and d) the friendly environment.

Very few participants responded to the question about less valuable aspects of the workshop, but it is perhaps worth noting, by way of feedback for NAATI, an expression of concern by one participant about the lack of policy parameters regarding the optional nature of the testing and uncertainty over the appropriate level for 'recognised' languages.

5. Conclusions and Recommendations

Insights and feedback gleaned from the workshop participants, as outlined above, have assisted us in formulating a number of recommendations for consideration by NAATI. It should be noted that some of these constitute a refinement of what was proposed in our previous reports. Both the new and revised recommendations are set out below.

It is recommended that:

1. the following CEFR levels proposed by the workshop participants be tentatively adopted by NAATI as indicating the minimally acceptable standards for each profession in both English and LOTE.

            Provisionally certified interpreter   Certified interpreter
Listening   B1+                                   C1
Speaking    B2                                    C1
Reading     B2                                    C1
Writing     B1                                    B2

            Recognised translator   Certified translator
Listening   -                       -
Speaking    -                       -
Reading     B2                      C1
Writing     B2                      C1

2. the above minimum standards should be used as a basis for determining cut-scores or thresholds

for satisfactory performance on the various test options (see below) that are proposed for screening.

3. the above levels/minimum standards be reviewed over time and, if necessary, adjusted with reference to a) exit proficiency levels achieved by students completing NAATI-approved courses and b) evidence comparing the CEFR levels of candidates (as reflected in results on the language proficiency tests recommended below) with their subsequent performance on the relevant NAATI tests.5

5 Particular attention should be paid to monitoring the appropriate level for writing for translators, as the levels set during the review of the CEFR descriptors, as noted under Findings above, did not corroborate the findings of the writing samples review for these professions.
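The comparison envisaged in b) could, for example, be summarised as NAATI test pass rates by screening-test CEFR band. A minimal sketch, using invented records purely for illustration:

    # Hypothetical sketch of the monitoring evidence envisaged in 3b):
    # tabulating NAATI test outcomes by candidates' screening CEFR level.
    # All data below are invented for illustration.
    from collections import defaultdict

    # (screening CEFR level, passed NAATI test?) pairs for past candidates
    records = [("B2", False), ("B2", True), ("C1", True), ("C1", True), ("B1+", False)]

    tallies = defaultdict(lambda: [0, 0])  # level -> [passes, attempts]
    for level, passed in records:
        tallies[level][0] += int(passed)
        tallies[level][1] += 1

    for level, (passes, attempts) in sorted(tallies.items()):
        print(f"{level}: {passes}/{attempts} passed ({100 * passes / attempts:.0f}%)")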

4. the following tests be adopted for screening candidates' English proficiency, giving due consideration to the status of the testing, i.e., whether it is undertaken on a voluntary basis, for the purpose of guiding NAATI applicants as to their readiness to take the relevant NAATI test, or as a compulsory requirement (for Recognition purposes).

For interpreters:

Voluntary screening:
- Dialang English (all tasks), AND (for those performing close to or above the minimum CEFR standard on Dialang) Versant English (speaking) (self-access)
- OR (on request) Custom test (NAATI administered and rated)

Compulsory screening:
- Versant English (proctored by NAATI)
- OR (on request) Custom test (NAATI administered and rated)

For translators:



Voluntary screening:
- Dialang English (all tasks), AND (for those performing close to or above the minimum CEFR standard on Dialang) Versant Writing (self-access)
- OR (on request) Custom English test (administered & rated by NAATI panel)

Compulsory screening:
- Versant Writing (proctored by NAATI)
- OR (on request) Custom English test (administered & rated by NAATI panel)

5. the following tests be adopted for screening candidates' proficiency in LOTE, giving due consideration to the status of the testing, i.e., whether it is undertaken on a voluntary basis, for the purpose of guiding NAATI applicants as to their readiness to take the relevant NAATI test, or as a compulsory requirement (for Recognition purposes).


For interpreters:

Voluntary screening:
- Dialang (all tasks) (for Danish, Dutch, Finnish, French, German, Greek, Icelandic, Irish Gaelic, Italian, Norwegian, Portuguese, Spanish and Swedish)
- AND/OR Versant (Speaking) (for Arabic, Chinese, Dutch, French and Spanish)
- OR ACTFL OPIc (for Arabic, Bengali, English, French, German, Italian, Indonesian, Japanese, Korean, Pashto, Persian Farsi, Portuguese, Russian, Spanish and Tagalog)
- OR Custom test (NAATI administered and rated) (for all other languages)

Compulsory screening:
- Versant Speaking (proctored) (for Arabic, Chinese, Dutch, French and Spanish)
- OR ACTFL OPIc (certification version) (for Arabic, Bengali, English, French, German, Italian, Indonesian, Japanese, Korean, Pashto, Persian Farsi, Portuguese, Russian, Spanish and Tagalog)
- OR Custom test (NAATI administered and rated) (for all other languages)

For translators:

Voluntary screening:
- Dialang (all tasks) (for Danish, Dutch, Finnish, French, German, Greek, Icelandic, Irish Gaelic, Italian, Norwegian, Portuguese, Spanish and Swedish), where the language is available
- AND/OR ACTFL WPT (for Arabic, Chinese-Cantonese, Chinese-Mandarin, Danish, French, German, Greek, Hindi, Italian, Japanese, Korean, Norwegian, Polish, Portuguese, Russian, Spanish, Turkish and Vietnamese)
- OR Custom test (NAATI administered and rated) (for all other languages)

Compulsory screening:
- ACTFL WPT (certification version) (for Arabic, Chinese-Cantonese, Chinese-Mandarin, Danish, French, German, Greek, Hindi, Italian, Japanese, Korean, Norwegian, Polish, Portuguese, Russian, Spanish, Turkish and Vietnamese)
- OR Custom test (NAATI administered and rated) (for all other languages)


6. where English or LOTE screening is implemented as a requirement, the recommended tests be administered under proctored conditions, so that NAATI can ascertain the identity of candidates, directly access their test results and have confidence in the integrity of those results.

7. to develop the custom tests mentioned above, NAATI commission a project to refine the draft specifications and sample tasks used for the current workshop so that a) appropriate adjustments can be made to reflect the different needs of the interpreter and translator professions, b) trialling can be undertaken and c) suitable training materials can be developed for test implementation.

8. a series of workshops for NAATI examining panels be convened for the purpose of a) familiarising participants with the test specifications and test tasks, b) developing alternative versions of these tasks as required, c) briefing participants on procedures for administering the custom tests to the relevant language groups and d) training them in the use of a common CEFR-linked language proficiency rating scale to score performance on these tasks. This process should begin with English and then proceed to LOTEs in high demand where no other screening test option is available.

9. NAATI clearly articulate a policy in relation to both the minimally acceptable proficiency standards in English and LOTE for the interpreter and translator professions and the language proficiency screening tests which it proposes as a means of gauging applicants' status in relation to these standards.

10. this policy be communicated to stakeholders through the creation of a website with language-specific links giving information, for each language, about the recommended screening options, the manner in which the relevant tests can be accessed by candidates, how results can be interpreted and what actions are advised in relation to these results.
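By way of illustration, the language-specific information behind such a website might be structured as in the sketch below. The shape of the data and the lookup function are our own assumptions; the two sample entries are drawn from the options listed in Recommendations 4 and 5.

    # Illustrative only: a possible data structure for the language-specific
    # screening information in Recommendation 10. Field names are invented;
    # the test options shown are taken from Recommendations 4 and 5.
    SCREENING_OPTIONS = {
        "French": {
            "interpreter": {
                "voluntary": ["Dialang (all tasks)", "Versant (Speaking) (self-access)"],
                "compulsory": ["Versant Speaking (proctored)"],
            },
            "translator": {
                "voluntary": ["Dialang (all tasks)", "ACTFL WPT"],
                "compulsory": ["ACTFL WPT (certification version)"],
            },
        },
        "Vietnamese": {  # no Dialang, Versant or OPIc coverage for interpreters
            "interpreter": {
                "voluntary": ["Custom test (NAATI administered and rated)"],
                "compulsory": ["Custom test (NAATI administered and rated)"],
            },
            "translator": {
                "voluntary": ["ACTFL WPT"],
                "compulsory": ["ACTFL WPT (certification version)"],
            },
        },
    }

    def options_for(language, profession, status):
        """Look up recommended tests, e.g. options_for('French', 'interpreter', 'voluntary')."""
        return SCREENING_OPTIONS[language][profession][status]

    print(options_for("Vietnamese", "interpreter", "compulsory"))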


6. References

Cizek, G. J., & Bunch, M. B. (2007). Standard setting: A guide to establishing and evaluating performance standards on tests. Thousand Oaks, CA: Sage.

Elder, C., Knoch, U., & Kim, H. (2016a). Preparing for the NAATI examination: Options and issues for English proficiency screening. Final report to the National Accreditation Authority for Translators and Interpreters (NAATI). Language Testing Research Centre, University of Melbourne.

Elder, C., Knoch, U., & Kim, H. (2016b). Preliminary LOTE proficiency screening of candidates seeking accreditation as interpreters and translators: A feasibility study. Final report to the National Accreditation Authority for Translators and Interpreters (NAATI).

Hale, S., Garcia, I., Hlavac, J., Kim, M., Lai, M., Turner, B., & Slatyer, H. (2012). Improvements to NAATI testing: Development of a conceptual overview for a new model for NAATI standards, testing and assessment (Project Ref. RG114318). The University of New South Wales.


7. Appendices

Appendix A. List of participants from the interpreter and translator professions

<withheld by NAATI>


Appendix B. Pre-workshop task

Thank you for agreeing to participate in the forthcoming workshop to determine language proficiency standards for the Translating and Interpreting professions and to consider issues and options for screening proficiency in English and LOTE. The workshop will be led by language testing experts Dr. Ute Knoch and Assoc. Prof. Catherine Elder from the Language Testing Research Centre at the University of Melbourne.

Before you come to the workshop, please take a little time to prepare brief responses to the following questions:

1. List some of the key tasks and activities involved in doing bilingual work in the interpreting profession.

2. What would be the minimally acceptable listening and speaking skills needed to perform these tasks adequately at a) the paraprofessional (or provisionally certified) level and b) the professional (or certified) level?

3. Are there other areas of language skill and/or language knowledge required for interpreting purposes other than listening and speaking?

4. List some of the key tasks and activities involved in doing bilingual work in the translating profession.

5. What would be the minimally acceptable reading and writing skills needed to perform these tasks adequately?

6. Are there other areas of language skill and/or language knowledge required for translating purposes other than reading and writing?

There will be an opportunity to discuss your responses with others during Day One of the workshop.


Appendix C. CEFR overall descriptors for Listening, Speaking, Reading and Writing


Appendix D. Versant evaluation form

1. How comfortable were you with the test-taking process?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

2. How familiar were you with the item types on the test?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

3. How adequately do these items measure speaking ability?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

4. How adequately do these items measure listening ability?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

5. How confident were you of the accuracy of the score you received?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

6. How useful was the score report?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

7. All things considered (i.e., your own impressions plus available information regarding the test's validity, reliability, etc.), how would you rate the test's suitability for the intended screening purpose?

(Low) 1...............2.............3.................4................5 (High) (Circle a number)

Give reasons for your rating

Name______________________________________________________________________________________

Page 30: DETERMINING AND MONITORING THE LANGUAGE PROFICENCY … · language proficiency screening and identified a number of tests which might be used.1. The reports also highlighted the need

30

Appendix E. Standard-setting results

Listening – CEFR descriptor review

            Provisionally certified interpreter    Certified interpreter
            English    LOTE                        English    LOTE
A2          1          -                           -          -
B1          2          4                           -          -
B1+         8          6                           -          -
B2          5          5                           3          4
B2+         1          -                           2          1
C1          -          2                           12         9
C2          -          -                           -          1
Agreed      B1+        B1+                         C1         C1
standard

Speaking – CEFR descriptor review

            Provisionally certified interpreter    Certified interpreter
            English    LOTE                        English    LOTE
A2          -          -                           -          -
B1          3          4                           -          -
B1+         4          2                           -          -
B2          8          7                           2          3
B2+         -          -                           3          1
C1          -          2                           11         12
C2          -          -                           -          -
Agreed      B2         B2                          C1         C1
standard

Reading – CEFR descriptor review

            Provisionally certified interpreter    Certified interpreter
            English    LOTE                        English    LOTE
A2          -          -                           -          -
B1          2          2                           -          -
B1+         4          4                           -          -
B2          11         11                          -          -
B2+         -          -                           5          5
C1          -          -                           12         12
C2          -          -                           -          -
Agreed      B2         B2                          C1         C1
standard

            Recognised translator                  Certified translator
            English    LOTE                        English    LOTE
A2          1          1                           -          -
B1          3          3                           -          -
B1+         1          1                           -          -
B2          9          8                           1          1
B2+         -          -                           -          -
C1          1          1                           15         14
C2          -          -                           1          2
Agreed      B2         B2                          C1         C1
standard

Writing – CEFR descriptor review

            Provisionally certified interpreter    Certified interpreter
            English    LOTE                        English    LOTE
A2          3          2                           -          -
B1          10         11                          3          3
B2          1          1                           7          6
C1          -          -                           4          4
C2          -          -                           -          -
Agreed      B1         B1                          B2         B2
standard

            Recognised translator                  Certified translator
            English    LOTE                        English    LOTE
A2          -          -                           -          -
B1          10         9                           -          -
B2          2          3                           -          -
C1          1          1                           12         12
C2          -          -                           3          3
Agreed      B1         B1                          C1         C1
standard

Sample review – Writing (English only)

            Provisionally certified interpreter    Certified interpreter
            Yes        No                          Yes        No
A2          3          9                           2          10
Low B1      6          6                           3          9
B1          8          4                           2          10
B2          12         0                           9          3
C1          12         0                           12         0
C2          11         1                           12         0
Agreed      B1                                     B2
standard

Sample review – Writing (English only)

            Recognised translator                  Certified translator
            Yes        No                          Yes        No
A2          2          13                          0          16
Low B1      5          9                           0          16
B1          4          11                          0          16
B2          14         1                           10         6
C1          14         1                           15         1
C2          12         3                           8          8
Agreed      B2                                     B2
standard

Speaking – sample review (English only)

            Provisionally certified interpreter    Certified interpreter
            Yes        No                          Yes        No
B1          7          10                          0          17
B2          15         2                           3          14
C1          17         0                           4          13
C2          15         2                           12         2
Agreed      B2                                     C2
standard
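The agreed standards above were reached through facilitated panel discussion rather than by calculation. As a purely illustrative cross-check, however, the median of the panellists' judgements, a common summary in standard setting (Cizek & Bunch, 2007), can be computed from the frequency counts. The sketch below, in which the function is our own, uses the counts from the first table for the provisionally certified interpreter (English).

    # Illustrative only: compute the median panel judgement from the
    # frequency counts reported above (cf. Cizek & Bunch, 2007). The
    # workshop's agreed standards were reached by discussion, not by
    # this calculation; the sketch simply shows one common summary.

    CEFR_ORDER = ["A2", "B1", "B1+", "B2", "B2+", "C1", "C2"]

    def median_level(counts):
        """Return the CEFR level at the median of a panel's judgements."""
        judgements = [lvl for lvl in CEFR_ORDER for _ in range(counts.get(lvl, 0))]
        return judgements[len(judgements) // 2]

    # English listening, provisionally certified interpreter (first table)
    counts = {"A2": 1, "B1": 2, "B1+": 8, "B2": 5, "B2+": 1}
    print(median_level(counts))  # B1+ — matches the agreed standard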


Appendix F. Standard-setting evaluation form

1. How well did you understand the process of setting standards? (Circle a number)

(Not at all) 1...............2.............3.................4................5 (Very)

2. How effective was this process for determining minimally acceptable standards for T & I purposes?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

3. How helpful were the CEFR descriptors for making your decision about appropriate levels?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

4. Were the speech and writing samples helpful in firming up your judgements about what constitutes a minimally competent level of proficiency for T & I purposes?

Yes / No (Circle one)

5. How confident are you in the final judgement you made for Listening?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

6. How confident are you in the final judgement you made for Speaking?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

7. How confident are you in the final judgement you made for Reading?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

8. How confident are you in the final judgement you made for Writing?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

Would you like to make any other comments?


Appendix G. Workshop evaluation form

1. How comfortable are you with the minimum proficiency standards set at this workshop?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

2. How helpful was the workshop in raising your awareness of issues associated with language proficiency screening?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

3. How satisfied were you with the opportunities provided for reviewing the different test options?

(Not at all) 1...............2.............3.................4................5 (Very) (Circle a number)

4. What was the most valuable aspect of the workshop?

5. What was the least valuable aspect of the workshop?

6. Would you like to make any other comments?