| **Inter-rater agreement** | Rater A | Rater B | ICC (95% CI) | Kappa§ (95% CI) |
|---|---|---|---|---|
| Total sample (n = 158) | 4.1 (2.3)† | 4.2 (2.5) | 0.926 (0.899–0.946)‡ | 0.79 (0.69–0.89) |
| NP (n = 99) | 5.0 (2.2)† | 5.1 (2.4) | 0.925 (0.888–0.949)‡ | 0.68 (0.49–0.86) |
| NNP (n = 59) | 2.6 (1.8)† | 2.7 (2.0) | 0.840 (0.730–0.905)‡ | 0.72 (0.50–0.93) |
| **Internal consistency (Cronbach's α1)** | 0.65 | 0.71 | | |
| **Test-retest reliability** | Test | Re-test | ICC (95% CI) | Kappa§ (95% CI) |
| Retest sample (n = 67) | 3.7 (2.5)† | 3.6 (2.6) | 0.949 (0.916–0.969)‡ | 0.79 (0.64–0.94) |
| NP (n = 37) | 4.7 (2.4)† | 4.4 (2.5) | 0.952 (0.904–0.976)‡ | 0.75 (0.53–0.98) |
| NNP (n = 30) | 2.5 (2.0)† | 2.7 (2.4) | 0.923 (0.840–0.963)‡ | 0.78 (0.56–1.00) |
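As an illustration of the agreement statistic reported above, the sketch below computes Cohen's kappa for two raters from scratch. The ratings are hypothetical values invented for the example, not study data, and this version is the unweighted kappa; the kappa marked § in the table may have been computed with a weighting scheme not shown here.

```python
# Minimal sketch (not the authors' code): unweighted Cohen's kappa
# for two raters' categorical scores.
from collections import Counter

def cohen_kappa(a, b):
    """Unweighted Cohen's kappa: (p_o - p_e) / (1 - p_e)."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed agreement: fraction of items both raters scored identically
    p_o = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement: product of each rater's marginal category frequencies
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[c] * cb[c] for c in set(a) | set(b)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings for illustration only (not the study's data)
rater_a = [1, 2, 2, 3, 3, 3, 1, 2]
rater_b = [1, 2, 3, 3, 3, 2, 1, 2]
print(round(cohen_kappa(rater_a, rater_b), 3))  # → 0.619
```

Perfect agreement yields kappa = 1.0, and agreement no better than chance yields 0, which is why values near 0.7–0.8, as in the table, are conventionally read as substantial agreement.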