Please find below details of an opening for a suitably qualified Masters graduate and circulate these details as appropriate. 
Potential applicants are most welcome to make further enquiries regarding this opportunity by contacting 
Margaret MacDougall ([log in to unmask]).
 
Please note the deadline of 17 April 2009 for submission of application details.
================================================================ 
*Community Health Sciences*
 
*ESRC +3 (PhD) Quota Award Studentships for 2009-10*
 
**Statistics, Methods and Computing**
  
The Community Health Sciences ESRC Outlet at the University of Edinburgh is pleased to be able to offer a +3 PhD Studentship for 2009.  Our category is Statistics, Methods and Computing.   The studentship is open to *_UK/EU citizens only_*, as per ESRC regulations.  We invite all those interested to read the ESRC regulations in full at
http://www.esrc.ac.uk/ESRCInfoCentre/opportunities/postgraduate/fundingopportunities/1plus3_quota.aspx?ComponentId=13826&SourcePageId=304
 
To be considered, you must forward the following documents:
 
    * A recent CV
    * A statement of interest
    * Degree/Transcript Copies
    * Two email reference contacts or academic reference letters
 
 
To: [log in to unmask] or Attn: Sarah McAllister, Community Health Sciences, University of Edinburgh, 
Teviot Place, Edinburgh EH8 9AG
 
Deadline: *_Friday, 17th April 2009_*
 
Title: Modern methods for detecting examiner bias in naturalistic assessment data
 
Supervisors: Dr Pam Warner (Reader in Medical Statistics), Dr Margaret MacDougall (Medical Statistician)
Research collaborator: Dr Simon C Riley, Senior Lecturer and Director of SSCs on the MBChB programme
 
Project summary: In many educational contexts, students are ranked against one another for future selection purposes. The outcomes of these processes ultimately influence their future career choices and the availability of suitable candidates for professional positions where the safety of the patient or client is at stake. It is therefore essential to estimate the reliability of assessment data more accurately by having effective procedures in place to test for the presence of examiner bias. Such evidence ought to consist of a valid estimate of effect size together with a confidence interval that captures the variability inherent across sample data.
 
This project aims to extend recent methodology involving hypothesis tests for detecting examiner bias [1] to more naturalistic cases in which assessment data do not meet parametric assumptions and cannot readily be normalized, may be categorical, and involve small sample sizes. The successful candidate will have access to a rich and continuously growing database of ratings for undergraduate medical students, commencing in 2001. This will facilitate the application of bootstrapping and Monte Carlo techniques to examine the robustness of hypothesis tests and to support the derivation of new approaches to obtaining associated confidence intervals. The research will apply specifically to cases where student scripts are double-marked, with one examiner having prior knowledge of student performance in a separate but related assessment. Emphasis will be placed on distinguishing between the natural tendencies of individual students to perform well or poorly overall and consistency in performance influenced by examiner bias arising from halo or horn effects.
 
The research outcomes will therefore be of wide relevance across the social and physical sciences, where double marking is often viewed as a reputable procedure for enhancing score reliability. They will also serve as a useful complement to current methodologies for detecting leniency and hawk effects within the more general literature on examiner bias. The final phase of the project will involve a more qualitative component requiring management of focus group sessions with examiners as participants. The findings from this work will be used to explore the underlying sociocultural mechanisms which lead to halo and horn effects, and in turn to address the more philosophical issue of the extent to which such effects ought to be modelled as measurement error when estimating score reliability.
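 
By way of illustration only, the short Python sketch below computes a percentile-bootstrap confidence interval for the mean difference between two examiners' marks on a set of double-marked scripts. The marks, examiner labels and number of resamples are entirely hypothetical, and the project itself concerns considerably richer methods suited to non-Normal, categorical and small-sample data.
 
import numpy as np

# Illustrative sketch only: hypothetical paired marks awarded to the same ten
# scripts by two examiners, where examiner B (but not examiner A) has seen the
# students' performance in an earlier, related assessment.
rng = np.random.default_rng(seed=1)
marks_a = np.array([62, 55, 71, 48, 66, 59, 74, 52, 68, 63], dtype=float)
marks_b = np.array([65, 54, 75, 53, 70, 58, 78, 57, 71, 66], dtype=float)
diffs = marks_b - marks_a  # positive values suggest more generous marking by B

# Percentile bootstrap: resample the script-level differences with replacement
# and record the mean difference in each resample.
n_boot = 10_000
boot_means = np.empty(n_boot)
for i in range(n_boot):
    sample = rng.choice(diffs, size=diffs.size, replace=True)
    boot_means[i] = sample.mean()

lower, upper = np.percentile(boot_means, [2.5, 97.5])
print(f"Observed mean difference: {diffs.mean():.2f} marks")
print(f"95% percentile bootstrap CI: ({lower:.2f}, {upper:.2f})")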
 
Applications are sought from candidates with a good first degree (at least a 2:1) and a Masters degree either in psychology (or a related social science) with a substantial quantitative component or in the mathematical sciences. This opportunity may be of interest to individuals wishing to pursue a career in educational research as an assessment professional or to acquire solid methodological training for a career as a social statistician.
 
Additional requirements
 
Essential: postgraduate-level training in psychometrics or educational psychology, normally as part of a Masters degree programme; experience in the application of statistical regression techniques, including Generalized Linear Modelling.
 
Desirable: experience in the application of intraclass correlation coefficients for a variety of research designs; experience in the use of MPlus, R or Stata and in the application of bootstrapping and Monte Carlo techniques; experience in the construction of in vivo codes from transcript data. 
 
1. MacDougall M, Riley SC, Cameron HS, McKinstry B. Halos and horns in the assessment of undergraduate medical students: a consistency-based approach. Journal of Applied Quantitative Methods. 2008;3(2):116-28.