Computer-Aided Assessment in Mathematics at Heriot-Watt University

Cliff Beevers, on behalf of the CALM group, Heriot-Watt University
cliff@ma.hw.ac.uk

The use of the computer in assessment has been a feature at Heriot-Watt University since 1985, when the CALM Project for Computer Aided Learning in Mathematics began. In those early days we quickly realised, to our surprise at the time, that students would freely choose to take a test to assure themselves of their progress in a course. Indeed, our initial findings revealed that on average students spent twice as long doing tests from the CALM Calculus suite of programs as they did looking at worked examples and trying simulations. The tests used random parameters to give different numbers in the same type of question, together with random selection from banks of questions on the topics being tested. The student answered each question by inputting a mathematical expression, which the computer marked by evaluation, comparing it with the true answer held in the machine. These tests proved extremely popular with students. The computer has also been employed to provide information to students as they take tests, whether in the form of marks, test data (such as how long a question took to complete) or advice. Information was also returned to the teacher so that the performance of the class as a whole could be monitored. Some of this early work is reported in [1].
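To illustrate the marking-by-evaluation idea, here is a minimal sketch in Python. It is not the CALM implementation: the function name, the use of Python's own expression parser, the sample range and the tolerance are all assumptions made for illustration. The principle is the one described above: rather than comparing strings, evaluate both the student's expression and the model answer at several randomly chosen points, and accept the answer if the values agree everywhere.

```python
import math
import random

def mark_by_evaluation(student_expr, true_expr, var="x",
                       samples=8, tol=1e-6):
    """Accept the student's answer if it agrees numerically with the
    model answer at several random sample points (a sketch of the idea,
    not CALM's actual algorithm)."""
    allowed = {"sin": math.sin, "cos": math.cos, "tan": math.tan,
               "exp": math.exp, "log": math.log, "sqrt": math.sqrt,
               "pi": math.pi, "e": math.e}
    for _ in range(samples):
        env = dict(allowed)
        env[var] = random.uniform(0.1, 2.0)   # avoid awkward points like 0
        try:
            s = eval(student_expr, {"__builtins__": {}}, env)
            t = eval(true_expr, {"__builtins__": {}}, env)
        except (ValueError, ZeroDivisionError, NameError, SyntaxError):
            return False                      # expression invalid at this point
        if abs(s - t) > tol * max(1.0, abs(t)):
            return False                      # values disagree: wrong answer
    return True                               # agrees at every point sampled

# Two algebraically equal forms of the same answer are both accepted:
print(mark_by_evaluation("2*sin(x)*cos(x)", "sin(2*x)"))   # True
print(mark_by_evaluation("sin(x)**2", "sin(2*x)"))         # False
```

Several sample points are used because two different expressions can agree by coincidence at a single point; a production system would need further safeguards beyond this sketch.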

The TLTP Project Mathwise

In 1992 CALM joined leading UK exponents of mathematical computer-aided learning in the Mathwise Project [2] as a principal resource centre responsible for the Authorware modules and an assessment policy. The Mathwise modules have exercises of many types scattered throughout the topics. These types include multiple choice, numerical answers, drag and drop, fill in a gap and responses requiring algebraic expressions. The use of random parameters, which had been pioneered in CALM, was fully exploited in Mathwise.
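A randomly parameterised question can be thought of as a template plus a recipe for its answer. The sketch below is purely illustrative: the template text, parameter ranges and function name are invented rather than taken from Mathwise or CALM. It shows how one template yields a different-looking question for every student while the marking scheme stays the same.

```python
import random

def differentiate_power_question():
    """Generate one instance of a 'differentiate a*x**n' question
    (a hypothetical template, for illustration only)."""
    a = random.randint(2, 9)
    n = random.randint(2, 6)
    text = f"Differentiate f(x) = {a}*x**{n} with respect to x."
    answer = f"{a * n}*x**{n - 1}"   # model answer as an expression string
    return text, answer

text, answer = differentiate_power_question()
print(text)     # e.g. "Differentiate f(x) = 7*x**4 with respect to x."
print(answer)   # e.g. "28*x**3", ready for marking by evaluation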

In 1994 an experiment was set up to show the role of computer assessment in grading student performance in an end-of-course examination. The Mathwise test engine dealt with most of the issues raised by students in an educational evaluation conducted before the grading test itself. The students' three main concerns were:

  1. Won’t it be easy to cheat by looking at a neighbouring screen? Random parameters in each question removed this difficulty;
  2. Would the computer understand the mathematical expression entered as the answer? An Input Tool showed the student how the computer was interpreting their answer and this problem also disappeared; and
  3. What about partial credit? Mathwise devised the notion of key steps to help resolve this issue.
Full details on this experiment appear in [3] and further thoughts on the knotty problem of partial credit have been published in [4].
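The key-steps idea can be sketched in code. In outline, a question is broken into intermediate answers, each carrying a share of the marks, so a student who falters at the final step still earns credit for the steps completed. The structure below is a guess at the flavour of such a scheme, not the Mathwise design; the weights, the worked example and the reuse of the mark_by_evaluation function from the earlier sketch are all assumptions.

```python
def mark_with_key_steps(student_answers, key_steps):
    """Award partial credit step by step (an illustrative sketch of the
    key-steps idea, not the actual Mathwise scheme).

    key_steps: list of (model_answer, marks) pairs, one per key step.
    student_answers: the student's expression for each step, in order.
    Uses mark_by_evaluation from the earlier sketch."""
    total = 0
    for answer, (model, marks) in zip(student_answers, key_steps):
        if mark_by_evaluation(answer, model):
            total += marks          # credit for each correct key step
    return total

# Integrate x*exp(x) by parts: credit the intermediate step as well as
# the final answer (the weights are invented for this example).
key_steps = [
    ("x*exp(x)", 2),                # step 1: choose u, dv and form u*v
    ("x*exp(x) - exp(x)", 3),       # step 2: complete the integration
]
print(mark_with_key_steps(["x*exp(x)", "x*exp(x) - exp(x)"], key_steps))  # 5
```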

Some Scottish Projects

Over the next few years the CALM team worked on a number of SHEFC projects. MARBLE produced a web-delivered diagnostic test for students at the start of year 1 of a traditional science and engineering service mathematics course. SUMSMAN [5] brought most of the Scottish universities into collaboration on the preparation of resources to help teach mathematics. SUMSMAN had several themes but, pertinent to this article, a number of end-of-module tests were completed for Mathwise core modules with the full cooperation of the Mathwise Executive. Many of these tests have found their way onto the two commercial CDs currently available from the Numerical Algorithms Group at Oxford [6].

Interactive PastPapers

Meanwhile in 1995 CALM was awarded the prestigious Bank of Scotland Award to Higher Education for its innovative teaching. With the prize money further commercial CDs have been created. Known as Interactive PastPapers, this series enhances many of the features of Mathwise, providing excellent revision software for students wishing to practise typical A-Level or GCSE questions [7]. This technology currently provides Heriot-Watt with continuous assessment software to deliver marks for approximately 400 students per year in each of the terms 1, 2 and 3 on its Riccarton campus.

Over the last couple of years CALM has developed a number of strategic alliances with a leading UK examination board (the University of Cambridge Local Examinations Syndicate) and a local software house in West Lothian (EQL). Our technology has migrated to provide both PC-based and web-delivered tests with many of the features of our earlier products. However, we have now separated the questions from the delivery mechanism, so that our questions can be stored as XML files and re-used more flexibly. A question editor accompanies this new suite of software and currently gives the user the chance to author questions in the style of Interactive PastPapers, but with only a fraction of the preparation time needed to author inside a package like Authorware.
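Separating questions from the delivery engine means a question becomes a small data file that any compliant engine can render. The schema below is entirely hypothetical (the element names are invented; the article does not describe the actual format), but it conveys the idea: the question text, the randomisable parameters and the model answer travel together, independent of the software that delivers them.

```python
import random
import xml.etree.ElementTree as ET

# A hypothetical question file: the element names are invented for
# this sketch and are not the format used by CALM or EQL.
QUESTION_XML = """\
<question id="diff-001">
  <param name="a" low="2" high="9"/>
  <param name="n" low="2" high="6"/>
  <text>Differentiate f(x) = {a}*x**{n} with respect to x.</text>
  <answer>{a}*{n}*x**({n}-1)</answer>
</question>
"""

def instantiate(xml_text):
    """Draw a random value for each <param> and substitute it into the
    question text and the model answer."""
    root = ET.fromstring(xml_text)
    values = {p.get("name"): random.randint(int(p.get("low")),
                                            int(p.get("high")))
              for p in root.findall("param")}
    text = root.findtext("text").format(**values)
    answer = root.findtext("answer").format(**values)
    return text, answer

text, answer = instantiate(QUESTION_XML)
print(text)    # e.g. Differentiate f(x) = 4*x**3 with respect to x.
print(answer)  # e.g. 4*3*x**(3-1), ready for marking by evaluation
```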

What of the Future?

Over the last fifteen years there has been a rapid development of computer-based assessment in all academic subjects. With the creation of the Learning and Teaching Support Network marking the start of a new phase in the support infrastructure for academics, it is appropriate that we, as a mathematical community, should stop and reflect on what we expect of the next generation of mathematical assessment software.

In an article in the first newsletter of the Learning and Teaching Support Network Centre for Maths, Stats and OR, Greenhow [8] has succinctly described the role of objective testing. He explains the current feature set of the QuestionMark product, enumerating the different question types available to the user. He also summarises the question types that may not yet be appropriate for computer delivery. The computer remains an excellent choice for questions that test basic skills and knowledge; the jury is still out on whether it can be made sufficiently flexible to support questions that seek to test more complex skills and understanding. It has been the philosophy of the CALM group that at least some complex skills can be practised using the computer as an automatic prompt and assessor.

It may be that different skills will become important in the 21st century and that the computer presents a better way of assessing these new skills. Whatever happens, it seems sensible for those of us in academia to share resources, so we should be looking to write questions to a common standard so that they can be re-purposed in different contexts. How should this be organised?

Perhaps the new LTSN Centre for Mathematics, Statistics and OR will take a lead and set up a mailing list for all those interested in the debate. A discussion of the main issues should be set in motion and, early next year, a venue designated to host a conference on the issues of computer-assisted assessment in Mathematics.

Without prejudice to this suggestion, the debate should range widely over educational issues as well as technical ones. Do the current computer assessment packages merely reinforce surface learning, with nothing to contribute to deep understanding? Is it possible to provide sufficient flexibility in test editors so that questions suitable for diagnostic, continuous and grading assessment can all be constructed? Which question types best test which mathematical skills? What about the security issues for assessment delivered at a distance? Much remains to be resolved; let the debate begin.

References

[1] C E Beevers, Assessment Aftermath!, Keynote presentation at Int Conf Tech Math Teach, Napier University, pp 46-58, March 1996

[2] R D Harding and D A Quinney, Mathwise and the UK Mathematics Courseware Consortium, Active Learning 4, 53-57 (1996)

[3] C E Beevers, G R McGuire, G Stirling and D G Wild, Mathematical Ability Assessed by Computer, J Comp Educ 25, 123-132 (1995)

[4] C E Beevers, M A Youngson, G R McGuire, D G Wild and D J Fiddes, Issues of Partial Credit in Mathematical Assessment by Computer, Alt-J 7, 26-32 (1999)

[5] T D Scott, Use of the MANs - a Scottish Initiative, Proc IMA Conf on the Mathematical Education of Engineers, University of Loughborough, April 1997

[6] Mathwise Pre-Calculus and Calculus CDs, Numerical Algorithms Group, Oxford (1997 and 2000)

[7] D G Wild, C E Beevers, D J Fiddes, G R McGuire and M A Youngson, Interactive PastPapers for A-Level and Higher Mathematics, Lander Educational Software, Glasgow (1997 and 1998)

[8] M Greenhow, Setting objective tests in mathematics with QM Designer, Newsletter 1 of the Learning and Teaching Support Network Centre for Maths, Stats and OR, Feb 2000