rater classification

W.MOYLE@qut.edu.au
Wed, 6 Apr 94 08:00 +1000

I am posting this request for advice on behalf of a friend who does not
yet have access to bulletin boards. I hope someone can help him.

Advice is sought regarding methods to analyse three raters' classification
of 320 constructs (items) into 22 categories. A measure of agreement is
required to assess:
a. agreement between pairs of raters, e.g. rater 1 with rater 2
b. overall agreement among the three raters
c. the reliability of the classification of constructs into particular
categories
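
For (a) and (b), the usual chance-corrected coefficients are Cohen's
kappa (one rater pair at a time) and Fleiss' kappa (all raters at once).
A minimal sketch in Python/NumPy, assuming the classifications are coded
as integers in a 320 x 3 array; the function names and the random demo
data are mine, for illustration only:

import numpy as np

def cohens_kappa(r1, r2):
    # Chance-corrected pairwise agreement (Cohen's kappa).
    cats = np.unique(np.concatenate([r1, r2]))
    po = np.mean(r1 == r2)  # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)  # chance
    return (po - pe) / (1 - pe)

def fleiss_kappa(ratings):
    # Overall chance-corrected agreement for m raters (Fleiss' kappa).
    # ratings: (n_items, n_raters) array of category codes.
    cats = np.unique(ratings)
    n_items, m = ratings.shape
    # counts[i, j] = number of raters placing item i in category j
    counts = np.stack([(ratings == c).sum(axis=1) for c in cats], axis=1)
    p_i = ((counts ** 2).sum(axis=1) - m) / (m * (m - 1))  # per-item agreement
    p_bar = p_i.mean()
    p_j = counts.sum(axis=0) / (n_items * m)  # overall category proportions
    p_e = (p_j ** 2).sum()
    return (p_bar - p_e) / (1 - p_e)

# Illustrative data only: 320 constructs, 3 raters, 22 categories (0..21)
rng = np.random.default_rng(0)
ratings = rng.integers(0, 22, size=(320, 3))
print(cohens_kappa(ratings[:, 0], ratings[:, 1]))  # (a) rater 1 vs rater 2
print(fleiss_kappa(ratings))                       # (b) all three raters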

Content analysis literature suggests that 1) an agreement coefficient
(correcting for chance) is preferable to a correlation, and 2)
coincidence matrices may be useful.
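
On point 2), coincidence matrices are the basis of Krippendorff's alpha,
which is also chance-corrected and, for nominal categories with no
missing ratings, can be computed as in the sketch below (assumptions as
above; the function name is mine):

import numpy as np

def krippendorff_alpha_nominal(ratings):
    # Krippendorff's alpha for nominal data via a coincidence matrix.
    # ratings: (n_items, n_raters) array of category codes, none missing.
    n_items, n_raters = ratings.shape
    cats = np.unique(ratings)
    idx = {c: i for i, c in enumerate(cats)}
    o = np.zeros((len(cats), len(cats)))
    # Each ordered pair of values from different raters within an item
    # adds 1/(m - 1) to the coincidence matrix.
    for row in ratings:
        for a in range(n_raters):
            for b in range(n_raters):
                if a != b:
                    o[idx[row[a]], idx[row[b]]] += 1.0 / (n_raters - 1)
    n_c = o.sum(axis=1)          # category marginals
    n = n_c.sum()                # total pairable values
    d_o = o.sum() - np.trace(o)  # observed disagreement (off-diagonal mass)
    d_e = (np.outer(n_c, n_c).sum() - (n_c ** 2).sum()) / (n - 1)  # expected
    return 1.0 - d_o / d_e

A side benefit for (c): the diagonal of the coincidence matrix shows,
category by category, where the raters agree, so categories that attract
unreliable classifications stand out.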
Any assistance would be appreciated.

If you send any replies to me, I will pass them on to him.

Hoping someone can help.
Wendy Moyle w.moyle@qut.edu.au