
Inter-rater reliability / Kappa statistics on Dual Screening #457

When dual screening, the system allows me to adjudicate individual decisions, but I am not able to calculate inter-rater reliability (IRR) within the software. IRR would help me track the accuracy of the underlying screeners and could also be published as part of the screening results.

Could an IRR/Kappa statistic be added so that adjudicators and administrators can track the accuracy of the underlying screeners in Dual Screening?
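
For reference, Cohen's kappa compares the observed agreement between two screeners with the agreement expected by chance. Below is a minimal sketch in Python of how the statistic could be computed from paired include/exclude decisions; the function name, labels, and sample data are hypothetical and not taken from the software itself.

    # Minimal sketch: Cohen's kappa for two screeners' decisions.
    # The labels and example data below are hypothetical.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two equal-length lists of screening decisions."""
        assert len(rater_a) == len(rater_b), "raters must screen the same records"
        n = len(rater_a)

        # Observed agreement: proportion of records where both decisions match.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

        # Expected agreement by chance, from each rater's marginal frequencies.
        freq_a = Counter(rater_a)
        freq_b = Counter(rater_b)
        p_e = sum((freq_a[label] / n) * (freq_b[label] / n)
                  for label in set(rater_a) | set(rater_b))

        return (p_o - p_e) / (1 - p_e)

    # Hypothetical dual-screening decisions for six records.
    screener_1 = ["include", "exclude", "include", "exclude", "include", "exclude"]
    screener_2 = ["include", "exclude", "exclude", "exclude", "include", "exclude"]
    print(round(cohens_kappa(screener_1, screener_2), 3))  # 0.667

A value of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate systematic disagreement.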

2 years ago
Changed the status to Under Consideration
2 years ago

This feature was added a while back on the Dashboard, under “Screening Agreement”. Enjoy!

a year ago
Changed the status to Completed
a year ago