|Title||Measuring and reporting intercoder reliability in plan quality evaluation research|
|Publication Type||Journal Article|
|Year of Publication||2014|
|Authors||Stevens, M, Lyles, W, Berke, PR|
|Journal||Journal of Planning Education and Research|
|Keywords||content analysis, intercoder reliability, Krippendorff’s alpha, percent agreement, plan evaluation, plan quality|
Plan quality evaluation researchers typically evaluate plans according to whether they contain certain desirable features. Best practice dictates that plans be evaluated by at least two readers and that researchers report a measure of the extent to which the readers agree on whether the plans contain those features. Established practice for assessing this agreement has been subject to criticism. We summarize this criticism, discuss an alternative approach to assessing agreement, and offer recommendations to help plan quality evaluation researchers improve the quality of their data and the manner in which they assess and report that quality.
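The two agreement measures named in the keywords can be sketched for the simplest setting of two coders assigning nominal (e.g., present/absent) codes with no missing data. This is an illustrative sketch only; the function names and the binary coding scheme are assumptions, not taken from the article:

```python
from collections import Counter
from itertools import permutations

def percent_agreement(coder_a, coder_b):
    """Share of plans on which the two coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def krippendorff_alpha_nominal(coder_a, coder_b):
    """Krippendorff's alpha for two coders, nominal data, no missing values.

    alpha = 1 - D_o / D_e, where D_o is observed disagreement and D_e is
    the disagreement expected by chance, both from the coincidence matrix.
    """
    # Coincidence matrix: each coded unit contributes both ordered pairs
    # (a, b) and (b, a) of the values the two coders assigned to it.
    pairs = Counter()
    for a, b in zip(coder_a, coder_b):
        pairs[(a, b)] += 1
        pairs[(b, a)] += 1
    # Marginal totals n_c and the grand total n (= 2 * number of units).
    n_c = Counter()
    for (c, _k), count in pairs.items():
        n_c[c] += count
    n = sum(n_c.values())
    # Observed disagreement: off-diagonal mass of the coincidence matrix.
    d_o = sum(cnt for (c, k), cnt in pairs.items() if c != k)
    # Expected disagreement under chance pairing of values.
    d_e = sum(n_c[c] * n_c[k] for c, k in permutations(n_c, 2)) / (n - 1)
    if d_e == 0:  # all coders used a single code; alpha is undefined
        return 1.0
    return 1.0 - d_o / d_e
```

For example, two coders who agree on 7 of 8 plans (percent agreement 0.875) can still obtain a noticeably lower alpha when one code dominates, which is the kind of divergence between the two measures that motivates reporting a chance-corrected statistic.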