Coefficient Lambda for Interrater Agreement Among Multiple Raters: Correction for Category Prevalence

Educational and Psychological Measurement

Abstract

Educational and Psychological Measurement, Volume 86, Issue 2, Page 287-319, April 2026.
Fleiss’s Kappa is an extension of Cohen’s Kappa, developed to assess the degree of interrater agreement among multiple raters or methods classifying subjects using categorical scales. Like Cohen’s Kappa, it adjusts the observed proportion of agreement to ...
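As the abstract notes, Fleiss's Kappa adjusts the observed proportion of agreement for agreement expected by chance. A minimal sketch of the standard Fleiss (1971) computation, assuming a subjects-by-categories count matrix where each row sums to the number of raters (the function name `fleiss_kappa` and the example data are illustrative, not taken from the article):

```python
def fleiss_kappa(counts):
    """Fleiss's Kappa for a table of rating counts.

    counts: list of rows, one per subject; counts[i][j] is the number
    of raters who assigned subject i to category j. Every row must
    sum to the same number of raters n.
    """
    N = len(counts)              # number of subjects
    n = sum(counts[0])           # raters per subject (constant by assumption)
    k = len(counts[0])           # number of categories

    # Per-subject observed agreement: pairs of raters who agree,
    # divided by all possible rater pairs n*(n-1).
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P_i) / N         # mean observed agreement

    # Marginal category proportions and chance agreement.
    p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)

    # Chance-corrected agreement.
    return (P_bar - P_e) / (1 - P_e)


# Perfect agreement: every rater pair agrees, so kappa = 1.
perfect = [[2, 0], [0, 2]]
print(fleiss_kappa(perfect))     # 1.0

# Partial agreement among 3 raters on 4 subjects.
mixed = [[3, 0], [2, 1], [1, 2], [0, 3]]
print(fleiss_kappa(mixed))       # ~0.333
```

The correction for category prevalence discussed in the article addresses a known limitation of this chance term: `P_e` depends on the marginal proportions `p_j`, so skewed category prevalence can depress Kappa even when raters agree often.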