*Placement Testing Query*

Let me say from the start that, given Gordonstoun’s unique status, I
believe that you would do best to develop your own screening/placement test
for your stated purpose [to establish provisional CEFR levels for your
learners for administrative purposes]. The rest of this document tries to
explain this.

Now here are some thoughts about placement testing.

There are four basic possible purposes in ESOL placement testing:

·         To assess the potential of individuals to learn the target
language

·         To identify what individual learners may be able to do [or may
not be able to do]

·         To match learners with appropriate courses [either ESOL courses
or other curriculum areas]

·         To predict the probable future performance of individual learners

An on-line placement test could certainly satisfy each of these purposes,
but I am not convinced that any given test would necessarily cater for all
four, and in any case I am not sure that any of these purposes is actually
significant in your case. On the face of it, you just want to
establish a general CEFR Level for internal school admissions
administration and visa application purposes [and the latter might
eventually require a more rigorous SELT anyway]. I am not sure how just
establishing a CEFR level would relate to each of the purposes [except
perhaps for the second purpose by reference to EAQUALS, which would be
time-consuming and goes beyond your remit]. Anyway, it seems that all you
actually want is a screening test that helps you identify CEFR levels [for
whatever purpose you have in mind]. The good news is that this can be done
quite simply by using what I call a quick and dirty test.

From the information you have supplied, I am not entirely clear why you
really need a placement test at Gordonstoun. Clearly you seek a screening
test to establish the levels of your target learners but what are you going
to do with this information? Are you actually going to place them in
separate teaching groups for ESOL? What is your target exit proficiency
level? Do you really need a diagnostic test related to CEFR rather than a
placement test? If you want the latter then you will definitely have to
devise your own, which would probably take a couple of years at least, so
let us stick to a quick and dirty placement test.

My own view is that teachers often become over-concerned about CEFR levels.
As I observed some three years ago in the ESOL-Research discussion forum, the
CEFR often seems over-used as an immediate and convenient <go-to> default
reference for busy teachers. The basic problem as far as I can see is that
teachers tend to seek a comfort blanket. They want to be able to classify,
categorise and pigeon-hole … and worse to stereotype. They want to have
something to cling to and cite as a support and justification for their
activities because they are not quite sure of the efficacy of these
activities. The result of this craving is that an available instrument that
was moderate and effective within its original parameters [the CEFR] has
been stretched to breaking-point. So my first advice to you is not to worry
too much about CEFR [unless your management is holding you to account ...
or worse still telling you to tie your testing to the CEFR]. But if CEFR is
what you want, so be it.

This leads me on to my next point: you should not give a test unless you
know why you are giving it. This is not new advice. It was given by Charles
Alderson over thirty years ago. As he pointed out, you need to be clear
about what exactly you want to test and as far as possible ensure your test
measures only that and not irrelevant abilities or knowledge. Charles
Alderson noted that there is little or no point in giving a test if you do
not need [or will not use] the information that it provides and there is
also no point in giving a test if you cannot interpret it or if you are not
willing to believe the results if they are contrary to your expectations.
Such considerations will inevitably influence your choice of
screening/placement test. Again, in my opinion, a quick and dirty test will
suffice for your apparent needs.

Another thing that is not clear is when you hope to use the placement test
in your institution. Ask yourself when placement tests are normally used
during a TESOL course. Most of the placement tests on the market are
intended for use before the start of a course. Is your test pre-entry or
post-entry? Sometimes in-house placement tests are used at the start of a
course to check placement [on the first day or in the first week or to
screen new learners as and when they enter]. Less commonly, placement tests
can be used during a course or even at the end of the course, although I
cannot see the point of doing this when you will [presumably] already have
ongoing progress tests or achievement tests of some sort all providing more
accurate data than a placement test. An ongoing programme of testing for
CEFR levels would probably best be completed through in-house course-based
progress tests, or external commercial achievement or proficiency tests.
You would not necessarily need a separate test for CEFR levels. Probably
you have in mind a rolling use to quickly check on individual CEFR levels
throughout the course on multiple occasions; this would be compatible with
on-line testing based on a quick placement test.

I have mentioned quick and dirty placement testing and obviously I think
that this would be your best option. As a bonus, if you also want to be
able to demonstrate to concerned parties [management; parents; learners;
and inspectors] that you have screened learners for CEFR levels then I
think that all you need is a cheap, quick and dirty on-line placement test
[DOLPT]. My point is that you do not need a massively sophisticated test
for your purpose. I outline some off-the-shelf choices below but my view is
that you would do better to develop your own quick and dirty in-house
on-line test [especially since you express a desire for something sensitive
to cultural and religious backgrounds].

I suggest that a cheap and quick DOLPT will have the following
characteristics:

·         It will be quick to complete [taking 30 minutes to 60 minutes
maximum].

·         It will consist entirely of objective test items [particularly
multiple-choice but perhaps cloze].

·         It should consist of between 40 and 80 items, directly or
indirectly sampling various aspects of language [not just grammar].

·         Its language focus will be entirely on grammar and vocabulary
[the assumption is that the four skills will be indirectly sampled
adequately by the underlying grasp of grammar and vocabulary].

·         It will be simple to score and will quickly establish a CEFR
level with sufficient reliability [that is, the same user making multiple
attempts should achieve more or less the same result each time; a minimal
scoring sketch follows this list].

·         It has a low cost per administration [but note that few
worthwhile commercial OLPTs are free: the Oxford Online Placement Test, for
example, costs GBP 4.29 per candidate].

·         It directly relates to an established set of levels [in this
instance CEFR].
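
By way of illustration of the scoring point above, here is a minimal
sketch [in Python] of how a DOLPT raw score might be converted into a
provisional CEFR band. The cut scores are invented for the sketch: you
would have to set and check your own against learners of known level at
Gordonstoun.

# Minimal sketch of DOLPT scoring: map a raw score on a 60-item
# objective test to a provisional CEFR band. The cut scores below are
# illustrative placeholders only and would need local calibration.

CUT_SCORES = [  # [minimum raw score out of 60, provisional CEFR band]
    (55, "C2"),
    (48, "C1"),
    (38, "B2"),
    (27, "B1"),
    (15, "A2"),
    (0, "A1"),
]

def provisional_cefr(raw_score: int, total_items: int = 60) -> str:
    """Return a provisional CEFR band for a raw score on the DOLPT."""
    if not 0 <= raw_score <= total_items:
        raise ValueError("raw score must be between 0 and the number of items")
    for minimum, band in CUT_SCORES:
        if raw_score >= minimum:
            return band
    return "A1"

print(provisional_cefr(41))  # -> "B2" with these illustrative cut scores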

You can have separate listening and reading sections [and even writing and
speaking sections] in a placement test but these are not necessary for a
DOLPT. I cannot see the point of such sections for your simple and singular
purpose. They will only cause the screening test to take longer to complete
than necessary. Ultimately you will have to decide whether it is worth it
when all you want is an indication of CEFR Level. If you include writing,
reading, listening and speaking sections, the whole test may take up to
three hours, all just to establish a simple CEFR level. I would
characterise that as a waste of time and energy. For the data you seek, I
think a quick DOLPT will do the job. I would, however, avoid tests that
claim to be completed in much less than 30 minutes: they are unlikely to be
reliably discriminative and are also unlikely to be sufficiently valid
across the target range of CEFR levels.

For a DOLPT you do not need any subjective test items [such as written
composition, short answers, or conversational English] and you do not need
any specific reading and writing items or any specific speaking and
listening items. Questionnaires and self-reporting rating scales have no
place in a DOLPT [although useful at other times]. Remember KISS is key.

There are a number of commercial online placement tests available [and many
of these are little more than computerised pen and paper tests]. Off the
shelf OLPTs have the advantage of availability and convenience and many are
based on a substantial and comprehensive data set. These days they also
often have the [possible] advantage of being computer-adaptive tests [CAT]
although this is unlikely to be a crucial consideration for a single and
isolated user such as Gordonstoun. It was important in some places I have
worked, where there were genuine problems with cheating and collusion in
large-scale administrations of screening and placement tests, but in
institutions such as yours all you probably need are a few different
versions of the same test on-line [on an intranet]. You probably do not
need to buy into a commercial CAT internet package.
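
If it helps, here is a minimal sketch [again in Python, and again purely
illustrative] of the sort of rotation I mean: a small set of parallel
versions of the same test spread across candidates on an intranet. The
version names and candidate identifiers are invented for the sketch.

# Minimal sketch: rotate a small set of parallel test versions across
# candidates so that neighbouring test-takers are unlikely to see the
# same paper. The version identifiers are hypothetical placeholders.

import hashlib

TEST_VERSIONS = ["dolpt_a", "dolpt_b", "dolpt_c", "dolpt_d"]

def assign_version(candidate_id: str) -> str:
    """Deterministically assign one of the parallel versions to a candidate.

    Hashing the candidate ID means the same learner always gets the same
    version on a retake check, while different learners are spread roughly
    evenly across the versions.
    """
    digest = hashlib.sha256(candidate_id.encode("utf-8")).hexdigest()
    index = int(digest, 16) % len(TEST_VERSIONS)
    return TEST_VERSIONS[index]

for cid in ["student-001", "student-002", "student-003"]:
    print(cid, "->", assign_version(cid))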

Another problem with off-the-shelf OLPTs is that they may not sufficiently
meet your criteria of sensitivity to cultural and religious backgrounds
[and perhaps to age and gender backgrounds]. DIY OLPTs can be much more
sensitive to specific cultural and religious backgrounds: after all, you do
your own editing of items.

Here is another caveat for you: the fact a commercial test claims to be
referenced to the CEFR Levels does not tell you what the test actually
consists of in the sense of what items are covered in it. There are many
tests [and textbooks and materials] all claiming to be referenced to the
CEFR. Crucially no publisher offers the same solution, the same underlying
description of language. Many if not most commercial providers of OLPTs are
trying to entice their customers to sign up for their specific commercial
ESOL courses [and in some instances a whole package of accompanying
textbooks and materials]. To sell their commercial products they are
obliged to differentiate their products from the essentially similar
products of all the other commercial competitors. *Caveat emptor*.

Placement tests [because they are essentially low-stakes tests] need not
directly test the four skills and in fact can comfortably get away without
doing so. And, as I have suggested, they may do their job better by not
doing so.

I have come across several on-line English Language placement tests that
you might want to look at [some may be out of date: I last surveyed the
field over 5 years ago]:

One:     Placement tests from commercial ESOL course providers

British Council: Level Check
<http://www.britishcouncil.jp/en/english/register/level-check> [free
on-line but there are also pay-for BC tests]

British Council International Language Assessment [requires individual
registration]

English First: Language Test Online <http://www.ef.co.uk/test/#/>
[screening test: requires individual registration]

International House: Test your English Language Level
<http://ihworld.com/online-test/level_test> [requires individual
registration]

Versant English Placement Test [covers speaking, listening, reading, and
writing skills in only about 50 minutes with almost instant feedback]

Wall Street English placement test

http://englishenglish.com/englishtest.htm

Something that might well interest you is the English Placement Test
Platform [http://englishplacement.com/], which enables you to create the
English placement test that you need: you can set up your own OLPT
predicated on the CEFR for no cost, with only a low cost per test
administered.

Most OLPTs from commercial ESOL providers cost money and require an
internet connection.

Two:     Placement tests from commercial examination organisations

Cambridge English Placement Test: a single on-line test covering reading
and listening skills as well as language knowledge

CRELLA Password Placement Test [a CAT Test of English Language knowledge
with 72 items lasting about 60 minutes]

Oxford Online Placement Test

Pearson English interactive placement test

Pearson LCCI placement test

These tests all cost money per entry but are probably the most neutral
[except they are typically tied to publishers and their course materials]:
note, for example, that the Oxford Online Placement Test is predicated on
grammar and vocabulary included in Oxford University Press ELT coursebooks.

Two other pen-and-paper placement tests from an earlier era were the NQCT
[Nelson Quickcheck Test] that consisted of a series of 15-minute tests [for
a maximum of 60 minutes] and the NELT [Nelson English Language Test] which
was a battery of ten tests each containing 50 four-choice MCT items
[covering grammar and vocabulary only] that would take about 4 hours to
complete.

Three:   Placement tests from commercial course book publishers

This is a less obvious source of placement test materials [especially
OLPTs]; the major publishers market integrated all-skills course books with
texts targeting successive CEFR levels, and these usually come with some
sort of placement tests. However, these are usually predicated on the
course-book and might only be of limited interest to you unless you use one
of the standard texts at Gordonstoun [Cutting Edge; Face2Face; Headway/New
Headway; New English File and so on].

Commercial interests dominate all these tests: never forget that it is [for
example] Pearson’s avowed aim to dominate the ESOL industry globally, so
that is what you would be buying into.

In your situation, you can forget about any placement/screening tests from
the USA [TOEFL, MTELP, MELAB, and CELT]. You can also forget about tests
such as IELTS and PTE.

I was interested to learn the reasons for the insistence on CEFR. I fully
understand your position and I appreciate that this is a managerial problem
rather than an educational one. Your problem clearly is going to be
persuading Gordonstoun Admissions that whatever test you settle on is going
to provide them with a straightforward indication of whether a student has the
potential to cope with the demands of studying in English [with or without
EAL support]. I do not think that the choice of test is going to be a huge
problem.

The problem is going to be persuading admissions that the following applies
[and this is just a tentative outline modified from IELTS documentation: it
is largely off the top of my head and you may well have different more
nuanced views in mind]. Basically what you will be suggesting is that CEFR
B1 is a basic prerequisite for Admissions [and CEFR B2 is more desirable].

First consider the following table [which is actually an extension of data
in IELTS materials: think of it as a touch of creative reverse
engineering]. It shows what I suspect are the sort of levels of linguistic
demands in the curriculum at your institution.

*CEFR* | *Linguistically demanding courses*          | *Linguistically less demanding courses*
C2     | Acceptable [no EAL support needed]          | Acceptable [no EAL support needed]
C1     | Acceptable [no EAL support needed]          | Acceptable [no EAL support needed]
B2     | Probably acceptable [with some EAL support] | Probably acceptable without EAL support
B1     | EAL support needed alongside course         | EAL support needed alongside course
A2     | EAL instruction essential before course     | EAL instruction needed before course
A1     | EAL instruction essential before course     | EAL instruction essential before course

It is most difficult to establish what is meant by <linguistically
demanding> courses and <linguistically less demanding> [and there is
certainly no <magic formula>]. I do not think you will have many if any
courses that demand C2 proficiency but you might have a few that entail
some C1 proficiency at the top end. In my experience, even at B2 level,
some additional English studies may be needed. What you [and then
admissions] need to do is look at the Gordonstoun curriculum and then
decide which courses are linguistically demanding and which are
linguistically less demanding. What I suggest above is largely off the top
of my head and is certainly open to modification but it does put across the
principle [and provides a starting point for you].
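
If it helps to make the table concrete for Admissions, here is a minimal
sketch [in Python] that encodes the provisional table above as a simple
lookup. It merely restates the table and should be revised once you have
looked at the Gordonstoun curriculum as suggested.

# Minimal sketch: encode the provisional CEFR / curriculum-demand table
# above as a lookup. The decisions simply restate the table and are a
# starting point only, to be revised after the curriculum review.

DECISIONS = {
    # (CEFR level, course demand): indicative admissions outcome
    ("C2", "demanding"):      "Acceptable [no EAL support needed]",
    ("C2", "less demanding"): "Acceptable [no EAL support needed]",
    ("C1", "demanding"):      "Acceptable [no EAL support needed]",
    ("C1", "less demanding"): "Acceptable [no EAL support needed]",
    ("B2", "demanding"):      "Probably acceptable [with some EAL support]",
    ("B2", "less demanding"): "Probably acceptable without EAL support",
    ("B1", "demanding"):      "EAL support needed alongside course",
    ("B1", "less demanding"): "EAL support needed alongside course",
    ("A2", "demanding"):      "EAL instruction essential before course",
    ("A2", "less demanding"): "EAL instruction needed before course",
    ("A1", "demanding"):      "EAL instruction essential before course",
    ("A1", "less demanding"): "EAL instruction essential before course",
}

def admissions_guidance(cefr_level: str, course_demand: str) -> str:
    """Return the indicative outcome for a screened level and course type."""
    return DECISIONS[(cefr_level, course_demand)]

print(admissions_guidance("B2", "demanding"))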

My next sentence is slightly tongue in cheek but it will get your
Admissions nodding. Just as all overseas students entering a university
must provide evidence that they can use English well enough to study
effectively at the university, so Gordonstoun must seek such evidence for
entry. You make the point that your understanding from Admissions is that
CEFR B1 [basically represented by IELTS 4.5 or equivalent Trinity GESE or
Trinity ISE] can be requested when the school sponsors a new student for a
visa [presumably a Tier 4 General Student visa below degree level]. In
other words entrants need Cambridge PET or the like. I think you are
correct to use CEFR B1 as shorthand for staff unfamiliar with ELT as to the
approximate language level of an incoming student [although, in my
experience, this is not always readily understood by a non-ESOL
specialist]. As you say, it can be mapped against IELTS and Cambridge
examinations [and allegedly, and tenuously I suspect, against GCSE
English]. I see no reason for you not to stick with this. The relative
levels of GCSE and A-Level English are measured on the UK National
Qualification Framework. The NQF scale broadly corresponds to the levels of
the CEFR, so some broad comparison between the levels of examinations is
possible within your range at Gordonstoun.

*CEFR* | *GCSE/A-Level [position 1]*  | *GCSE/A-Level [position 2]*  | *GCSE/A-Level [position 3]*
C1     | A-Level [A2]                 | A-Level [A2]                 |
B2     | GCSE higher grades [6 to 9]  | AS-Level                     | A-Level [A2]
B1     | GCSE higher grades [4 to 5]  | GCSE higher grades [5 to 9]  | GCSE higher grades [5 to 9]
A2     | GCSE lower grades [1 to 3]   | GCSE lower grades [1 to 4]   | GCSE lower grades [1 to 4]

The consensus on this seems to be patchy: I have come across all three of
the above positions [although I favour the first column and feel the others
may be optimistic]. But you will have your own views on this.

Although I think it is important for the above points to be understood,
discussed and agreed, in some respects the most important consideration is
the one contained in your original query: how to establish a nominal CEFR
level with a minimum of effort and expense.

As I indicated above establishing a nominal CEFR level is not [in my view]
very difficult: there are dozens of OLPT tests available, with largely
cosmetic differences, none of which are really any better [or any worse]
than any other. After all, you are not really seeking a placement test: you
just seek a valid and reliable screening test that will identify a general
CEFR level.

The practicability of your test is important: you want to obtain your
target data with a minimum of effort and expense; you are concerned with
the general simplicity and convenience of use of the test instrument. This
is where KISS comes in. You are also concerned with temporal qualities: the
time it takes to construct the test; the time it takes to administer the
test; the time it takes to score the test; and the time it takes to
interpret its results. An in-house DOLPT will cater for all these aspects.

You are after all only sampling a narrow aspect of TESOL: all you want to
obtain is a short and simple CEFR level identifier [such as <CEFR B1>]. I
fail to see why you should have to waste money on this by using a
commercial [the clue is in the adjective] OLPT and help keep somebody
somewhere in employment. Commercial OLPTs set out to make a profit at your
expense. Most seem to work out at about GBP 5 a throw. Why waste money on
them when you can produce a perfectly adequate in-house test that costs
nothing? Sometimes it seems to me that suppliers of OLPTs are just preying
on the insecurities of teachers and management and offering them a chance
to delegate responsibility for statements about CEFR [a comfort blanket].
Ask yourself why there are so many OLPTs on the market. It is because
they are easy enough to write once you understand the CEFR and have a basic
grasp of testing. Users are really just buying assurance/reassurance [my
<comfort blanket>] rather than esoteric expertise.

So I am suggesting that you can easily design your own OLPT for Gordonstoun
and I would strongly encourage you to do so. All you seek is a simple CEFR
level identifier. Moreover, Management is breathing down your neck: Admissions
feel the paperwork is proving onerous and would now like to move online.
They obviously want to KISS. So help them out.

You have already indicated that the main decision Admissions need to make
is whether a student may be able to cope with the demands of studying in
English [with support for English if need be]. My next sentence is crucial.
A multiple-choice test of grammar and vocabulary [and the two aspects are
actually very difficult to separate] can be a sufficient indicator on its
own of a learner’s ability to cope with the language demands made on
learners by English medium study. This really is all you are trying to find
out.

I am extremely glad to see that you have reservations about commercial
OLPTs [and I hope that I have managed to add to these]. My view is that an
in-house OLPT will send out a signal to parents about the professionalism
of staff at Gordonstoun as well as the other factors you mention.  My point
is that you should not have to rely on the advice of <testing experts> and
test providers: you can be even more expert than they are within your own
context. After all, you are a teacher. And so am I [I am not an <academic>].

It is interesting that your Admissions section receives a wide range of
school reports from around the globe and that you report that some from
China are particularly sparse. This is not unexpected and the problem for
Admissions [and for you] is the comparability of all these tests. Even if
some agents provide their own screening tests, comparability will be a
problem [some you can trust; others you cannot] and, as you note, there is
no real consistency across the board. This, for me, is yet another reason
for an in-house OLPT.

By the way, consider the point that the perceived need for an OLPT may
represent jumping on a bandwagon chasing one of the latest buzz-words. You
should not automatically assume that an OLPT has tremendous and
overwhelming advantages for you in your situation. On-line testing is
merely a technological device and the playground of geeks: you still have
the problem of GIGO [and this is the real elephant in the room]. There are
ways to make screening/placement tests partly adaptive or semi-adaptive
with just as useful outcomes [I will explain this point in a subsequent
email if you wish].
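
Purely by way of illustration [and without pre-empting that explanation],
one common partly adaptive arrangement is a short routing test that decides
which level-targeted second stage a learner sits. The thresholds and module
labels in this Python sketch are invented for the purpose.

# Illustrative sketch of a simple two-stage [semi-adaptive] screening
# design: a short routing test decides which level-targeted second
# stage a learner sits. Thresholds and stage labels are invented here
# purely for illustration.

def route_candidate(routing_score: int, routing_items: int = 20) -> str:
    """Choose the second-stage module from a short routing test score."""
    proportion = routing_score / routing_items
    if proportion >= 0.75:
        return "upper module [targets B2-C2 items]"
    if proportion >= 0.40:
        return "middle module [targets A2-B1 items]"
    return "lower module [targets A1-A2 items]"

for score in [5, 11, 17]:
    print(score, "->", route_candidate(score))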

You write that so far you have trialled the Clarity English online
placement test for Admissions, but have reservations. You tell me that
Admissions are now asking about the Oxford and Cambridge tests, but that to
you all these commercial tests seem to have a <language school> feel about
them. I take your point [and this might just be a matter of face validity].
This is perhaps
inevitable given the testing target [CEFR levelling] and given that
language schools are probably their bread-and-butter market. All I would
ask is whether this matters that much: you either live with it or DIY.

I was not familiar with the Clarity English OLPT but have had a look at it
and I can see nothing that marks it out from the rest of the pack. The
company is based in Hong Kong [which might be relevant to you in Scotland]
but it seems almost churlish to point out that CEFR is based in Strasbourg.
The Clarity English website consists mainly of marketing hype [written by
ICT specialists rather than ESOL practitioners] that says very little about
the test in rather too many words; consequently, I would not be inclined to
give it the time of day. As a test it is certainly no better than the more
established Oxford placement test and the Cambridge test [both of which
seem to involve a greater ESOL testing input than some of the
ICT-technician dominated smaller providers].

In my view all these commercial OLPTs share the same advantages and
disadvantages. Most also have some cultural bias and, whilst they retain
their focus on the CEFR, it sometimes becomes clear that the CEFR itself
does not always travel well; it seems to have limitations in geographical
<reach> outside its European base. Most OLPTs from commercial ESOL
providers also cost money and require an internet connection. I am not sure
that Gordonstoun Admissions would care to tie their systems into an
external [and potentially unreliable] internet OLPT provider. Apart from
the Oxford test
and the Cambridge test they are mainly small-scale commercial enterprises
ruled by accountants and they can go bust. I would not trust many of them
further than I could throw them. What would you do if your test provider
went bust?

I still think that it is best for you to produce your own OLPT [although
this is a more demanding path than an instant off-the-shelf solution]. I
have already mentioned something that might well interest you in this
respect: the English Placement Test Platform [http://englishplacement.com/],
which claims to enable you to set up your own OLPT predicated on the CEFR
for no cost, with only a low cost per test administered. I am not familiar
with this site but it is probably worth exploring [and I would be
interested in any feedback on this].

I am going to finish this email with a little academic theory. My position
on DIY relates closely to what Barry O’Sullivan [2011] refers to as
professionalisation and localisation. The two are related: the process of
professionalisation leads to a process of test localisation. In Barry’s
view the latter process is an important development in language testing. It
is predicated on the assumption that in most circumstances, localised tests
are more likely to allow teachers and other test users to make more valid
assumptions about test takers than non-localised tests. Barry notes that
tests developed for use in a specific context with a clearly defined
candidature [such as yours at Gordonstoun] should take into account the
meaningful parameters associated with both that context and that
candidature when developing any test and scoring system. A new level of
professionalism is emerging among such local test developers, with growing
levels of expertise that exceed those of their predecessors.

All this implies that where a test is designed for use within a given
context [such as for screening purposes and reassurance at a unique public
school], it is unlikely that a placement test designed with no specific
candidature in mind [basically a very general opportunity sample: in the
words of Walter Scott: ‘Come one, come all’] will offer an appropriate
measure of the language proficiencies entailed in that context. As Barry
suggests, if internationalised tests are to be used outside of their
original specified domain [often for commercial marketing purposes] then
they really should be validated for this usage. And as Barry points out,
this is rarely, if ever, done.

ATB,

David Thornton

*References*

Green A [2011] *The Password Test – Design, Development And Reliability*
Luton: CRELLA

O’Sullivan B [2011] *Language Testing*. Chapter 18 in Simpson J [Ed] *The
Routledge Handbook of Applied Linguistics*. Abingdon: Routledge

On Wed, Aug 23, 2017 at 6:20 PM, Helen Turner <[log in to unmask]>
wrote:

> I wonder if anyone has any experience with any of the online placement
> tests available which give a CEFR level for students?
>
>
>
> My students are 11 - 18 years old (mainly in the upper age range though).
> If we choose online testing the tests need to be sensitive to cultural and
> religious backgrounds..
>
>
>
> Any advice greatly appreciated.
>
>
>
> Helen
>
>
>
>
>
>
> Gordonstoun Schools Limited, Elgin, Moray, Scotland, IV30 5RF
> Scottish Charity Number SC037867. Company Registered in England No 288105.
> Registered Office c/o Veale Wasbrough Vizards, Solicitors, Barnards Inn, 86
> Fetter Lane, London EC4A 1AD
>
>

***********************************
ESOL-Research is a forum for researchers and practitioners with an interest in research into teaching and learning ESOL. ESOL-Research is managed by James Simpson at the Centre for Language Education Research, School of Education, University of Leeds.
To join or leave ESOL-Research, visit
http://www.jiscmail.ac.uk/lists/ESOL-RESEARCH.html
To contact the list owner, send an email to
[log in to unmask]