- Cohen's Kappa: What it is, when to use it, and how to avoid its pitfalls | by Rosaria Silipo | Towards Data Science
- GitHub - thomaspingel/cohens-kappa-matlab: a simple implementation of Cohen's Kappa statistic, which measures agreement between two judges for values on a nominal scale. See the Wikipedia entry for a quick overview. (A minimal sketch of the computation follows this list.)
- Interrater agreement statistics with skewed data: evaluation of alternatives to Cohen's kappa | Semantic Scholar
- Beyond kappa: an informational index for diagnostic agreement in dichotomous and multivalue ordered-categorical ratings | SpringerLink
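
The repository above is MATLAB code; as a rough illustration of what the statistic computes, here is a minimal Python sketch of Cohen's kappa for two raters, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance from each rater's marginal label frequencies. The function name and example labels below are illustrative and not taken from the repository.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items on a nominal scale."""
    if len(rater_a) != len(rater_b):
        raise ValueError("both raters must label the same items")
    n = len(rater_a)
    # Observed agreement: fraction of items the two raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: computed from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Illustrative labels for 10 items rated "pos"/"neg" by two raters.
a = ["pos", "pos", "neg", "pos", "neg", "pos", "pos", "neg", "neg", "pos"]
b = ["pos", "neg", "neg", "pos", "neg", "pos", "pos", "pos", "neg", "pos"]
print(round(cohens_kappa(a, b), 3))  # 0.583: 80% raw agreement, 52% expected by chance
```

In practice, scikit-learn's `sklearn.metrics.cohen_kappa_score` computes the same quantity (with optional linear or quadratic weights for ordinal labels). Note also that the division by (1 - p_e) is undefined when chance agreement reaches 1, and that kappa behaves poorly with heavily skewed label distributions, which is the issue the two papers above address.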