# inter-rater-agreement

Here are 11 public repositories matching this topic...

Evaluation and agreement scripts for the DISCOSUMO project. Each evaluation script takes both manual annotations and automatic summarization output as input. The formatting of these files is highly project-specific. However, the evaluation functions for precision, recall, ROUGE, Jaccard, Cohen's kappa, and Fleiss' kappa may be applicable to other domains too.

  • Updated Feb 10, 2017
  • Python
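The repository's own implementation is not shown here, but as a rough illustration of the kind of agreement metric it provides, a minimal self-contained sketch of Cohen's kappa for two raters might look like the following (the function name, labels, and data are hypothetical):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who each labelled the same sequence of items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's label marginals.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    pe = sum(counts_a[label] * counts_b.get(label, 0)
             for label in counts_a) / n ** 2
    if pe == 1:  # both raters used a single identical label throughout
        return 1.0
    return (po - pe) / (1 - pe)

# Hypothetical example: two annotators labelling five items.
a = ["yes", "no", "yes", "yes", "no"]
b = ["yes", "no", "no", "yes", "no"]
print(round(cohens_kappa(a, b), 3))  # → 0.615
```

Kappa of 1.0 means perfect agreement and 0.0 means agreement no better than chance; Fleiss' kappa, also listed above, generalizes this idea to more than two raters.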
