Could anyone tell me about statistics that can be used to evaluate 
inter-rater reliability?
Health workers were asked to use a new tool to rate three different case 
studies. What we want to know is whether, within groups (nurses, dieticians, 
etc.), the ratings were similar. I have heard of Cohen's kappa, but I believe 
this is only for two raters, and we have 2-10 in each group.
Any suggestions appreciated.
Thanks, Emily


