Lancaster Glossary of Child Development

Inter-rater reliability

Posted by Brian Hopkins, May 22, 2019

A measure of agreement between two or more raters, such as the intraclass correlation coefficient or kappa, the latter accounting for agreements expected by chance.

See Cohen’s kappa coefficient, Measurement efficacy, Reliability
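
As a minimal sketch of how kappa corrects raw agreement for chance, here is a short Python example for two raters assigning nominal category labels; the function name and the example data are hypothetical illustrations, not part of the glossary.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    # Observed proportion of agreement: how often the raters match
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal frequencies
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    # Kappa: observed agreement beyond chance, scaled by its maximum
    return (p_o - p_e) / (1 - p_e)

# Hypothetical: two raters coding the same 10 infant behaviours
a = ["smile", "cry", "smile", "gaze", "cry", "smile", "gaze", "smile", "cry", "gaze"]
b = ["smile", "cry", "smile", "cry", "cry", "smile", "gaze", "smile", "gaze", "gaze"]
print(round(cohens_kappa(a, b), 2))  # ≈ 0.7
```

With these data the raters agree on 8 of 10 items (observed agreement 0.80), but their marginal frequencies alone would produce agreement 0.34 by chance, so kappa is about 0.70; values in this range are conventionally read as substantial agreement.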
