Ask the expert: What is inter-rater reliability?

Inter-rater reliability is the extent to which two or more individuals (or raters) agree. In the context of medical staff peer review, inter-rater reliability can be defined as the extent to which two separate reviewers reach similar conclusions regarding a physician’s performance. It also applies to a single reviewer reaching the same or similar conclusions when reviewing similar cases.
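To make the idea concrete, agreement between two reviewers can be quantified. The sketch below is illustrative only; the reviewer names, rating categories, and sample scores are hypothetical and not drawn from any actual peer review program. It computes simple percent agreement and Cohen's kappa, a standard statistic that corrects raw agreement for the agreement two raters would reach by chance.

```python
from collections import Counter

def percent_agreement(ratings_a, ratings_b):
    """Fraction of cases on which the two reviewers gave the same rating."""
    matches = sum(a == b for a, b in zip(ratings_a, ratings_b))
    return matches / len(ratings_a)

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(ratings_a)
    observed = percent_agreement(ratings_a, ratings_b)
    # Chance agreement is estimated from each reviewer's marginal
    # rating frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical scores two reviewers assigned to the same ten charts.
# The categories "meets", "concern", and "fails" are invented for
# illustration, not taken from any published scoring system.
reviewer_1 = ["meets", "meets", "concern", "meets", "fails",
              "meets", "concern", "meets", "meets", "fails"]
reviewer_2 = ["meets", "concern", "concern", "meets", "fails",
              "meets", "meets", "meets", "meets", "fails"]

print(f"Percent agreement: {percent_agreement(reviewer_1, reviewer_2):.0%}")
print(f"Cohen's kappa:     {cohens_kappa(reviewer_1, reviewer_2):.2f}")
```

Run on the sample data, this reports 80% raw agreement but a kappa of about 0.64, showing why chance-corrected measures matter: two reviewers who mostly assign the commonest category will agree often even without a shared standard.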

Some physicians worry that there is little consistency, or inter-rater reliability, in the peer review process, but standardizing that process goes a long way toward addressing this concern.

The Greeley Company has developed a standardized method of scoring individual cases. The case scoring system uses a defined set of scoring categories to help physician reviewers make efficient and consistent determinations when reviewing a patient chart. It also focuses the committee’s discussion of each case in a consistent manner.

This week’s question and answer are adapted from Measuring Physician Competency: How to Collect, Assess, and Provide Performance Data, Second Edition.