Dear All,

We have an academic who has run a peer-only assessment with five questions using split 100. We're running version 1.0.0.3, and for some reason the fudge factor hasn't been applied to the groups that have a non-submitter. Is this a bug, and has it been fixed in later versions?

In the meantime, we're trying to work out the calculation manually, but we've run into an issue with how the fudge factor is calculated. The fudge factor is the number of students in the group divided by the number of students who submitted. This works fine for self-and-peer assessment, but with peer-only assessment the fudge factor cannot be applied to the student who has not submitted, as they have already received all the marks they should have. If we apply the fudge factor only to the students who did submit, the total of the WebPA scores comes to less than the number of students in the group. Obviously we can tweak this manually so that the WebPA scores total correctly, but has anyone else solved this issue? Is there a common calculation we can apply?
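To illustrate what I mean (using made-up numbers rather than our actual data), here is a rough Python sketch of the arithmetic as I understand it, assuming a group of four in which three students submit and every assessor splits their marks equally:

# Illustrative sketch with made-up numbers: a group of four (A, B, C, D),
# peer-only assessment, D has not submitted. With split 100, each submitted
# assessment is normalised so the fractions awarded sum to 1.
group = ["A", "B", "C", "D"]
submitters = ["A", "B", "C"]                   # D has not submitted

received = {student: 0.0 for student in group}
for assessor in submitters:
    peers = [s for s in group if s != assessor]    # peer-only: no self-assessment
    for peer in peers:
        received[peer] += 1.0 / len(peers)         # equal split of the 100

# received is now {"A": 2/3, "B": 2/3, "C": 2/3, "D": 1.0}; D already holds
# every fraction they could possibly receive, so needs no correction.

fudge = len(group) / len(submitters)           # 4/3: group size over submitters, as above

# Apply the fudge factor to the submitters only, leaving D untouched:
scores = {s: received[s] * (fudge if s in submitters else 1.0) for s in group}
print(sum(scores.values()))                    # about 3.67, not 4

Applying the factor to everyone instead would bring the total back to 4, but it would inflate the non-submitter's score to 4/3 even though they have received exactly the marks they should have, which seems equally wrong.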

Thanks in advance for your help. Please let me know if you'd like me to send our actual calculations as a worked example.

Best wishes,
Julie


Julie Voce
E-Learning Services Manager
ICT
Imperial College London
Level 4, Sherfield Building
Exhibition Road
London SW7 2AZ

Tel:  +44 (0)20 7594 6293
Email: [log in to unmask]