rater classification

W.MOYLE@qut.edu.au
Wed, 6 Apr 94 08:25 +1000

I have been asked by a friend to place this message on the network
as he does not have access to bulletin boards. I hope someone can help
him.

Advice is sought regarding methods to analyse 3 raters' classification
of 320 constructs (items) into 22 categories. A measure of agreement
is required to analyse:
a) agreement between pairs of raters, e.g. rater 1 with rater 2
b) overall agreement among the 3 raters
c) the reliability of the classification of constructs into particular
categories.
(A sketch of how a) and b) might be computed follows.)
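
One possible starting point, assuming the codes are stored as integer
labels 0-21 in a 320 x 3 array: Cohen's kappa gives the pairwise,
chance-corrected agreement of a), and Fleiss' kappa the overall
agreement of b). A minimal sketch in Python with numpy (the function
names are mine, and the random data below is only a placeholder for
the real table):

    import numpy as np

    def cohen_kappa(r1, r2, k):
        # Chance-corrected agreement between two raters (Cohen's kappa).
        r1, r2 = np.asarray(r1), np.asarray(r2)
        po = np.mean(r1 == r2)                       # observed agreement
        p1 = np.bincount(r1, minlength=k) / r1.size  # rater 1 marginals
        p2 = np.bincount(r2, minlength=k) / r2.size  # rater 2 marginals
        pe = np.sum(p1 * p2)                         # agreement expected by chance
        return (po - pe) / (1 - pe)

    def fleiss_kappa(ratings, k):
        # Overall chance-corrected agreement for 3+ raters (Fleiss' kappa).
        # ratings: (n_items, n_raters) array of labels in 0..k-1.
        n_items, n_raters = ratings.shape
        # counts[i, j] = number of raters who put item i in category j
        counts = np.stack([np.bincount(row, minlength=k) for row in ratings])
        p_i = (np.sum(counts**2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
        p_j = counts.sum(axis=0) / (n_items * n_raters)
        pe = np.sum(p_j**2)
        return (p_i.mean() - pe) / (1 - pe)

    # Placeholder data; the real analysis would load the 320 x 3 table.
    rng = np.random.default_rng(0)
    ratings = rng.integers(0, 22, size=(320, 3))
    print(cohen_kappa(ratings[:, 0], ratings[:, 1], 22))  # a) rater 1 vs 2
    print(fleiss_kappa(ratings, 22))                      # b) all 3 raters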

The content analysis literature suggests that 1) an agreement
coefficient (correcting for chance) is preferable to a correlation,
and 2) coincidence matrices may be useful.
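
On point 2), the coincidence matrix is the basis of Krippendorff's
alpha, a chance-corrected coefficient that extends naturally to any
number of raters. A sketch of the nominal-level version, under the
same data layout as above (again, the function names are mine):

    import numpy as np

    def coincidence_matrix(ratings, k):
        # Entry (c, d) accumulates, for each item, every ordered pair
        # of codes (c, d) from two different raters, weighted by
        # 1/(m - 1) where m is the number of raters per item.
        n_items, n_raters = ratings.shape
        o = np.zeros((k, k))
        for row in ratings:
            for a in range(n_raters):
                for b in range(n_raters):
                    if a != b:
                        o[row[a], row[b]] += 1.0 / (n_raters - 1)
        return o

    def krippendorff_alpha_nominal(ratings, k):
        # Krippendorff's alpha for nominal data, from the coincidence matrix.
        o = coincidence_matrix(ratings, k)
        n_c = o.sum(axis=1)                       # category marginals
        n = n_c.sum()                             # total pairable judgements
        d_o = n - np.trace(o)                     # observed disagreement
        d_e = (n * n - np.sum(n_c**2)) / (n - 1)  # disagreement expected by chance
        return 1.0 - d_o / d_e

The matrix itself may also bear on c): a category whose diagonal
entry is large relative to its marginal is one the raters apply
consistently, while off-diagonal mass shows which of the 22
categories are being confused with one another.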

Any assistance would be appreciated.
Please forward any replies to me and I will pass them on.
Thanks
Wendy Moyle w.moyle@qut.edu.au