Chance Agreement


Se for this table is 81/90 = .90, which is quite high. But here the test and criterion are statistically independent: the level of Se is exactly what one would expect if the test results were purely random, for example generated by tossing a biased coin with pr(heads) = .90. This example shows that when the marginal rates are extreme, chance alone can produce a high level of Se. This is precisely the argument originally used to justify kappa as an index of agreement, rather than simply reporting the raw agreement rate, po. Another example comparing chi-square and kappa is the agreement distribution in Table IV. Here χ² = 6.25 (p < 0.02), versus κ = 0.20. Although the chi-square is significant, the kappa value indicates little agreement.
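To make the arithmetic concrete, here is a minimal Python sketch of the chance-agreement calculation. The 2×2 table it uses (a = 81, b = 9, c = 9, d = 1, N = 100) is not printed above but is the one implied by the description: both marginals equal 90/100 and the cells follow statistical independence, so 81 = 90 × 90 / 100. The helper function kappa_2x2 is purely illustrative, not taken from any library.

def kappa_2x2(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table.

    a = both positive, b = test+/criterion-,
    c = test-/criterion+, d = both negative.
    """
    n = a + b + c + d
    po = (a + d) / n                     # raw (observed) agreement
    p_test_pos = (a + b) / n             # marginal positive rate, test
    p_crit_pos = (a + c) / n             # marginal positive rate, criterion
    # chance-expected agreement under independence of the two marginals
    pe = p_test_pos * p_crit_pos + (1 - p_test_pos) * (1 - p_crit_pos)
    return po, pe, (po - pe) / (1 - pe)

# Table implied by the text: independent ratings, pr(positive) = .90 each
po, pe, k = kappa_2x2(a=81, b=9, c=9, d=1)
se = 81 / (81 + 9)                       # sensitivity Se = a / (a + c)
print(f"Se = {se:.2f}, po = {po:.2f}, pe = {pe:.2f}, kappa = {k:.2f}")
# -> Se = 0.90, po = 0.82, pe = 0.82, kappa = 0.00

Running the sketch reproduces the point of the paragraph: Se is .90 and raw agreement po is .82, yet kappa is exactly zero, because the chance-expected agreement pe equals po when the two ratings are independent.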
